[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
30529 1726882585.79101: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
30529 1726882585.79427: Added group all to inventory
30529 1726882585.79428: Added group ungrouped to inventory
30529 1726882585.79431: Group all now contains ungrouped
30529 1726882585.79433: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
30529 1726882585.88186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
30529 1726882585.88228: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
30529 1726882585.88245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
30529 1726882585.88282: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
30529 1726882585.88331: Loaded config def from plugin (inventory/script)
30529 1726882585.88332: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
30529 1726882585.88361: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
30529 1726882585.88418: Loaded config def from plugin (inventory/yaml)
30529 1726882585.88419: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
30529 1726882585.88478: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
30529 1726882585.88744: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
30529 1726882585.88746: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
30529 1726882585.88748: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
30529 1726882585.88753: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
30529 1726882585.88756: Loading data from /tmp/network-Kc3/inventory.yml
30529 1726882585.88801: /tmp/network-Kc3/inventory.yml was not parsable by auto
30529 1726882585.88841: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
30529 1726882585.88867: Loading data from /tmp/network-Kc3/inventory.yml
30529 1726882585.88922: group all already in inventory
30529 1726882585.88927: set inventory_file for managed_node1
30529 1726882585.88930: set inventory_dir for managed_node1
30529 1726882585.88930: Added host managed_node1 to inventory
30529 1726882585.88932: Added host managed_node1 to group all
30529 1726882585.88932: set ansible_host for managed_node1
30529 1726882585.88933: set ansible_ssh_extra_args for managed_node1
30529 1726882585.88935: set inventory_file for managed_node2
30529 1726882585.88936: set inventory_dir for managed_node2
30529 1726882585.88937: Added host managed_node2 to inventory
30529 1726882585.88938: Added host managed_node2 to group all
30529 1726882585.88938: set ansible_host for managed_node2
30529 1726882585.88939: set ansible_ssh_extra_args for managed_node2
30529 1726882585.88940: set inventory_file for managed_node3
30529 1726882585.88942: set inventory_dir for managed_node3
30529 1726882585.88942: Added host managed_node3 to inventory
30529 1726882585.88943: Added host managed_node3 to group all
30529 1726882585.88943: set ansible_host for managed_node3
30529 1726882585.88944: set ansible_ssh_extra_args for managed_node3
30529 1726882585.88945: Reconcile groups and hosts in inventory.
30529 1726882585.88948: Group ungrouped now contains managed_node1
30529 1726882585.88949: Group ungrouped now contains managed_node2
30529 1726882585.88950: Group ungrouped now contains managed_node3
30529 1726882585.89004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
30529 1726882585.89078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
30529 1726882585.89112: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
30529 1726882585.89130: Loaded config def from plugin (vars/host_group_vars)
30529 1726882585.89131: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
30529 1726882585.89136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
30529 1726882585.89141: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
30529 1726882585.89168: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
30529 1726882585.89398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882585.89462: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
30529 1726882585.89484: Loaded config def from plugin (connection/local)
30529 1726882585.89486: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
30529 1726882585.89859: Loaded config def from plugin (connection/paramiko_ssh)
30529 1726882585.89861: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
30529 1726882585.90405: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30529 1726882585.90432: Loaded config def from plugin (connection/psrp)
30529 1726882585.90434: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
30529 1726882585.90826: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30529 1726882585.90849: Loaded config def from plugin (connection/ssh)
30529 1726882585.90852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
30529 1726882585.92055: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30529 1726882585.92078: Loaded config def from plugin (connection/winrm)
30529 1726882585.92080: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
30529 1726882585.92102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
30529 1726882585.92143: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
30529 1726882585.92184: Loaded config def from plugin (shell/cmd)
30529 1726882585.92186: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
30529 1726882585.92204: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
30529 1726882585.92240: Loaded config def from plugin (shell/powershell)
30529 1726882585.92241: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
30529 1726882585.92279: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
30529 1726882585.92380: Loaded config def from plugin (shell/sh)
30529 1726882585.92382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
30529 1726882585.92405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
30529 1726882585.92474: Loaded config def from plugin (become/runas)
30529 1726882585.92475: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
30529 1726882585.92581: Loaded config def from plugin (become/su)
30529 1726882585.92582: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
30529 1726882585.92673: Loaded config def from plugin (become/sudo)
30529 1726882585.92675: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
30529 1726882585.92698: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30529 1726882585.92902: in VariableManager get_vars()
30529 1726882585.92914: done with get_vars()
30529 1726882585.92999: trying /usr/local/lib/python3.12/site-packages/ansible/modules
30529 1726882585.95019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
30529 1726882585.95088: in VariableManager get_vars()
30529 1726882585.95092: done with get_vars()
30529 1726882585.95095: variable 'playbook_dir' from source: magic vars
30529 1726882585.95096: variable 'ansible_playbook_python' from source: magic vars
30529 1726882585.95096: variable 'ansible_config_file' from source: magic vars
30529 1726882585.95097: variable 'groups' from source: magic vars
30529 1726882585.95097: variable 'omit' from source: magic vars
30529 1726882585.95098: variable 'ansible_version' from source: magic vars
30529 1726882585.95098: variable 'ansible_check_mode' from source: magic vars
30529 1726882585.95099: variable 'ansible_diff_mode' from source: magic vars
30529 1726882585.95099: variable 'ansible_forks' from source: magic vars
30529 1726882585.95099: variable 'ansible_inventory_sources' from source: magic vars
30529 1726882585.95100: variable 'ansible_skip_tags' from source: magic vars
30529 1726882585.95100: variable 'ansible_limit' from source: magic vars
30529 1726882585.95101: variable 'ansible_run_tags' from source: magic vars
30529 1726882585.95101: variable 'ansible_verbosity' from source: magic vars
30529 1726882585.95123: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml
30529 1726882585.95505: in VariableManager get_vars()
30529 1726882585.95516: done with get_vars()
30529 1726882585.95545: in VariableManager get_vars()
30529 1726882585.95553: done with get_vars()
30529 1726882585.95583: in VariableManager get_vars()
30529 1726882585.95594: done with get_vars()
30529 1726882585.95624: in VariableManager get_vars()
30529 1726882585.95632: done with get_vars()
30529 1726882585.95661: in VariableManager get_vars()
30529 1726882585.95669: done with get_vars()
30529 1726882585.95703: in VariableManager get_vars()
30529 1726882585.95712: done with get_vars()
30529 1726882585.95746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
30529 1726882585.95755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
30529 1726882585.95912: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
30529 1726882585.96009: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
30529 1726882585.96012: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
30529 1726882585.96033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
30529 1726882585.96049: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
30529 1726882585.96146: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
30529 1726882585.96181: Loaded config def from plugin (callback/default)
30529 1726882585.96182: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30529 1726882585.97143: Loaded config def from plugin (callback/junit)
30529 1726882585.97146: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30529 1726882585.97187: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
30529 1726882585.97252: Loaded config def from plugin (callback/minimal)
30529 1726882585.97254: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30529 1726882585.97290: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
30529 1726882585.97349: Loaded config def from plugin (callback/tree)
30529 1726882585.97351: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
30529 1726882585.97459: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
30529 1726882585.97462: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
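The deprecation warning at the head of this trace tells you exactly how to adapt: switch to the singular `ANSIBLE_COLLECTIONS_PATH` name and, if desired, silence the banner. A minimal `ansible.cfg` sketch, assuming the paths reported in the header above (the run itself used no config file):

```ini
# ansible.cfg — hedged sketch; values taken from the trace header
[defaults]
# singular key replaces the deprecated plural ANSIBLE_COLLECTIONS_PATHS
collections_path = /tmp/collections-spT
# disables the deprecation banner, as the warning itself suggests
deprecation_warnings = False
```

Equivalently, the environment variable `ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT` can be exported instead of the ini key.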
PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30529 1726882585.97489: in VariableManager get_vars()
30529 1726882585.97501: done with get_vars()
30529 1726882585.97507: in VariableManager get_vars()
30529 1726882585.97514: done with get_vars()
30529 1726882585.97518: variable 'omit' from source: magic vars
30529 1726882585.97552: in VariableManager get_vars()
30529 1726882585.97564: done with get_vars()
30529 1726882585.97583: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
30529 1726882585.99572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
30529 1726882585.99644: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
30529 1726882585.99678: getting the remaining hosts for this loop
30529 1726882585.99679: done getting the remaining hosts for this loop
30529 1726882585.99682: getting the next task for host managed_node1
30529 1726882585.99685: done getting next task for host managed_node1
30529 1726882585.99687: ^ task is: TASK: Gathering Facts
30529 1726882585.99689: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882585.99691: getting variables
30529 1726882585.99692: in VariableManager get_vars()
30529 1726882585.99703: Calling all_inventory to load vars for managed_node1
30529 1726882585.99706: Calling groups_inventory to load vars for managed_node1
30529 1726882585.99708: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882585.99720: Calling all_plugins_play to load vars for managed_node1
30529 1726882585.99730: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882585.99733: Calling groups_plugins_play to load vars for managed_node1
30529 1726882585.99762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882585.99811: done with get_vars()
30529 1726882585.99818: done getting variables
30529 1726882585.99877: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Friday 20 September 2024  21:36:25 -0400 (0:00:00.025)       0:00:00.025 ******
30529 1726882585.99900: entering _queue_task() for managed_node1/gather_facts
30529 1726882585.99902: Creating lock for gather_facts
30529 1726882586.00247: worker is 1 (out of 1 available)
30529 1726882586.00258: exiting _queue_task() for managed_node1/gather_facts
30529 1726882586.00272: done queuing things up, now waiting for results queue to drain
30529 1726882586.00275: waiting for pending results...
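Given the inventory source (`/tmp/network-Kc3/inventory.yml`), the collection location (`/tmp/collections-spT`), and the debug-level detail recorded in this trace, the run was presumably invoked along these lines. This is a hedged reconstruction, not a command taken from the log:

```shell
# Hedged reconstruction of the invocation behind this trace
ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT \
ansible-playbook -vvvv \
  -i /tmp/network-Kc3/inventory.yml \
  /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
```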
30529 1726882586.00478: running TaskExecutor() for managed_node1/TASK: Gathering Facts
30529 1726882586.00638: in run() - task 12673a56-9f93-b0f1-edc0-00000000001b
30529 1726882586.00641: variable 'ansible_search_path' from source: unknown
30529 1726882586.00690: calling self._execute()
30529 1726882586.00768: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882586.00772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882586.00779: variable 'omit' from source: magic vars
30529 1726882586.00854: variable 'omit' from source: magic vars
30529 1726882586.00872: variable 'omit' from source: magic vars
30529 1726882586.00897: variable 'omit' from source: magic vars
30529 1726882586.00932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882586.00959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882586.00974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882586.00987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882586.01001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882586.01028: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882586.01032: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882586.01034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882586.01105: Set connection var ansible_shell_executable to /bin/sh
30529 1726882586.01109: Set connection var ansible_pipelining to False
30529 1726882586.01113: Set connection var ansible_shell_type to sh
30529 1726882586.01122: Set connection var ansible_timeout to 10
30529 1726882586.01124: Set connection var ansible_connection to ssh
30529 1726882586.01129: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882586.01145: variable 'ansible_shell_executable' from source: unknown
30529 1726882586.01147: variable 'ansible_connection' from source: unknown
30529 1726882586.01150: variable 'ansible_module_compression' from source: unknown
30529 1726882586.01152: variable 'ansible_shell_type' from source: unknown
30529 1726882586.01154: variable 'ansible_shell_executable' from source: unknown
30529 1726882586.01157: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882586.01162: variable 'ansible_pipelining' from source: unknown
30529 1726882586.01164: variable 'ansible_timeout' from source: unknown
30529 1726882586.01168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882586.01298: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
30529 1726882586.01306: variable 'omit' from source: magic vars
30529 1726882586.01310: starting attempt loop
30529 1726882586.01313: running the handler
30529 1726882586.01326: variable 'ansible_facts' from source: unknown
30529 1726882586.01342: _low_level_execute_command(): starting
30529 1726882586.01348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30529 1726882586.01852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882586.01856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882586.01859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<<
30529 1726882586.01861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882586.01898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
30529 1726882586.01914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882586.01972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.03664: stdout chunk (state=3): >>>/root <<<
30529 1726882586.03899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882586.03902: stdout chunk (state=3): >>><<<
30529 1726882586.03905: stderr chunk (state=3): >>><<<
30529 1726882586.03907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882586.03909: _low_level_execute_command(): starting
30529 1726882586.03912: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662 `" && echo ansible-tmp-1726882586.038432-30541-269957735684662="` echo /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662 `" ) && sleep 0'
30529 1726882586.04699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882586.04703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882586.04733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882586.04829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.06732: stdout chunk (state=3): >>>ansible-tmp-1726882586.038432-30541-269957735684662=/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662 <<<
30529 1726882586.06874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882586.06896: stdout chunk (state=3): >>><<<
30529 1726882586.06913: stderr chunk (state=3): >>><<<
30529 1726882586.07098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882586.038432-30541-269957735684662=/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882586.07102: variable 'ansible_module_compression' from source: unknown
30529 1726882586.07105: ANSIBALLZ: Using generic lock for ansible.legacy.setup
30529 1726882586.07107: ANSIBALLZ: Acquiring lock
30529 1726882586.07109: ANSIBALLZ: Lock acquired: 139794692461328
30529 1726882586.07112: ANSIBALLZ: Creating module
30529 1726882586.38620: ANSIBALLZ: Writing module into payload
30529 1726882586.38876: ANSIBALLZ: Writing module
30529 1726882586.38984: ANSIBALLZ: Renaming module
30529 1726882586.39002: ANSIBALLZ: Done creating module
30529 1726882586.39098: variable 'ansible_facts' from source: unknown
30529 1726882586.39108: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882586.39174: _low_level_execute_command(): starting
30529 1726882586.39185: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
30529 1726882586.40503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882586.40574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30529 1726882586.40683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882586.40720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882586.40810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.42508: stdout chunk (state=3): >>>PLATFORM <<<
30529 1726882586.42604: stdout chunk (state=3): >>>Linux <<<
30529 1726882586.42615: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
30529 1726882586.42909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882586.42947: stderr chunk (state=3): >>><<<
30529 1726882586.43036: stdout chunk (state=3): >>><<<
30529 1726882586.43039: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882586.43045 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
30529 1726882586.43281: _low_level_execute_command(): starting
30529 1726882586.43284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
30529 1726882586.43534: Sending initial data
30529 1726882586.43537: Sent initial data (1181 bytes)
30529 1726882586.44264: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882586.44625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529
1726882586.44702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882586.48079: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 30529 1726882586.48554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882586.48562: stdout chunk (state=3): >>><<< 30529 1726882586.48571: stderr chunk (state=3): >>><<< 30529 1726882586.48588: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882586.48679: variable 'ansible_facts' from source: unknown
30529 1726882586.48796: variable 'ansible_facts' from source: unknown
30529 1726882586.48815: variable 'ansible_module_compression' from source: unknown
30529 1726882586.48865: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
30529 1726882586.48905: variable 'ansible_facts' from source: unknown
30529 1726882586.49117: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py
30529 1726882586.49302: Sending initial data
30529 1726882586.49391: Sent initial data (153 bytes)
30529 1726882586.50192: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882586.50255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882586.50341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
30529 1726882586.50372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882586.50442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882586.50464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.51975: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<<
30529 1726882586.51992: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
30529 1726882586.52218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
30529 1726882586.52282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp1hf_xb7o /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py <<<
30529 1726882586.52287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py" <<<
30529 1726882586.52362: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp1hf_xb7o" to remote "/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py" <<<
30529 1726882586.55127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882586.55177: stderr chunk (state=3): >>><<<
30529 1726882586.55398: stdout chunk (state=3): >>><<<
30529 1726882586.55405: done transferring module to remote
30529 1726882586.55407: _low_level_execute_command(): starting
30529 1726882586.55410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/ /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py && sleep 0'
30529 1726882586.56438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882586.56704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882586.56723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882586.56790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.58528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882586.58538: stdout chunk (state=3): >>><<<
30529 1726882586.58549: stderr chunk (state=3): >>><<<
30529 1726882586.58762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882586.58765: _low_level_execute_command(): starting
30529 1726882586.58768: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/AnsiballZ_setup.py && sleep 0'
30529 1726882586.59758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<<
30529 1726882586.59941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882586.60337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
30529 1726882586.60416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882586.62501: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
30529 1726882586.62581: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<<
30529 1726882586.62643: stdout chunk (state=3): >>>import '_io' # <<<
30529 1726882586.62646: stdout chunk (state=3): >>>import 'marshal' # <<<
30529 1726882586.62684: stdout chunk (state=3): >>>import 'posix' # <<<
30529 1726882586.62819: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<<
30529 1726882586.62836: stdout chunk (state=3): >>>import '_codecs' # <<<
30529 1726882586.62848: stdout chunk (state=3): >>>import 'codecs' # <<<
30529 1726882586.62890: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
30529 1726882586.62904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<<
30529 1726882586.62928: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cbe7b30> <<<
30529 1726882586.63116: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<<
30529 1726882586.63167: stdout chunk (state=3): >>>import '_collections_abc' # <<<
30529 1726882586.63201: stdout chunk (state=3): >>>import 'genericpath' # <<<
30529 1726882586.63208: stdout chunk (state=3): >>>import 'posixpath' # <<<
30529 1726882586.63224: stdout chunk (state=3): >>>import 'os' # <<<
30529 1726882586.63232: stdout chunk (state=3): >>>import '_sitebuiltins' # <<<
30529 1726882586.63277: stdout chunk (state=3): >>>Processing user site-packages <<<
30529 1726882586.63280: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<<
30529 1726882586.63413: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca09130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca09fa0> <<<
30529 1726882586.63441: stdout chunk (state=3): >>>import 'site' # <<<
30529 1726882586.63498: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
30529 1726882586.63841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
30529 1726882586.63858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
30529 1726882586.63897: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<<
30529 1726882586.63909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<<
30529 1726882586.63911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
30529 1726882586.63998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
30529 1726882586.64001: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
30529 1726882586.64115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca47da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca47fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
30529 1726882586.64134: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
30529 1726882586.64299: stdout chunk (state=3): >>># code object from
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
30529 1726882586.64302: stdout chunk (state=3): >>>import 'itertools' # <<<
30529 1726882586.64304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<<
30529 1726882586.64306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca7f770> <<<
30529 1726882586.64308: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<<
30529 1726882586.64309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<<
30529 1726882586.64311: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca7fe00> <<<
30529 1726882586.64313: stdout chunk (state=3): >>>import '_collections' # <<<
30529 1726882586.64337: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5fa40> <<<
30529 1726882586.64351: stdout chunk (state=3): >>>import '_functools' # <<<
30529 1726882586.64802: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5d160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca44f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<<
30529 1726882586.64808: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<<
30529 1726882586.64811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<<
30529 1726882586.64814: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<<
30529 1726882586.64816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<<
30529 1726882586.64818: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9f6b0> <<<
30529 1726882586.64820: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9e2d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9cb60> <<<
30529 1726882586.64907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<<
30529 1726882586.64927: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca441d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899cad4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad4a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899cad4dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca42cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<<
30529 1726882586.64959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<<
30529 1726882586.64976: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad54c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad5190> <<<
30529 1726882586.64989: stdout chunk (state=3): >>>import 'importlib.machinery' # <<<
30529 1726882586.65015: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<<
30529 1726882586.65027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<<
30529 1726882586.65045: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad63c0> <<<
30529 1726882586.65196: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<<
30529 1726882586.65204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
30529 1726882586.65207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<<
30529 1726882586.65209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<<
30529 1726882586.65220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf05c0> <<<
30529 1726882586.65235: stdout chunk (state=3): >>>import 'errno' # <<<
30529 1726882586.65254: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf1d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<<
30529 1726882586.65392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf2ba0> <<<
30529 1726882586.65530: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf20f0> #
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf3c80> <<<
30529 1726882586.65533: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf33b0> <<<
30529 1726882586.65535: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad6330> <<<
30529 1726882586.65538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<<
30529 1726882586.65544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<<
30529 1726882586.65549: stdout chunk (state=3): >>> <<<
30529 1726882586.65557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<<
30529 1726882586.65598: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c7e3bf0> <<<
30529 1726882586.65673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80c6e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80c440> <<<
30529 1726882586.65690: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80c710> <<<
30529 1726882586.65721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<<
30529 1726882586.65796: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<<
30529 1726882586.65918: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80cfe0> <<<
30529 1726882586.66044: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80d9d0> <<<
30529 1726882586.66151: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80c890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c7e1d90> <<<
30529 1726882586.66181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80daf0> <<<
30529 1726882586.66185: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad6ae0> <<<
30529 1726882586.66212: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<<
30529 1726882586.66290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<<
30529 1726882586.66295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<<
30529 1726882586.66326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<<
30529 1726882586.66400: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c837110> <<<
30529 1726882586.66492: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<<
30529 1726882586.66510: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c85b470> <<<
30529 1726882586.66525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<<
30529 1726882586.66570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<<
30529 1726882586.66700: stdout chunk (state=3): >>>import 'ntpath' # <<<
30529 1726882586.66705: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<<
30529 1726882586.66708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8bc290> <<<
30529 1726882586.66711: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<<
30529 1726882586.66725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<<
30529 1726882586.66765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<<
30529 1726882586.66854: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8be9f0> <<<
30529 1726882586.66925: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8bc3b0> <<<
30529 1726882586.66956: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c889280> <<<
30529 1726882586.66982: stdout
chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1253d0> <<< 30529 1726882586.67016: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c85a270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80fce0> <<< 30529 1726882586.67180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30529 1726882586.67273: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f899c85a870> <<< 30529 1726882586.67525: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_a8ic8e4g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 30529 1726882586.67601: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.67627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30529 1726882586.67641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30529 1726882586.67675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30529 1726882586.67816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30529 1726882586.67819: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 30529 
1726882586.67822: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c18b1a0> import '_typing' # <<< 30529 1726882586.67967: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c16a090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1691f0> # zipimport: zlib available <<< 30529 1726882586.67995: stdout chunk (state=3): >>>import 'ansible' # <<< 30529 1726882586.68016: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.68024: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.68081: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 30529 1726882586.69411: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.70535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c188e90> <<< 30529 1726882586.70644: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30529 1726882586.70654: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1baa80> <<< 30529 1726882586.70669: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba810> <<< 30529 1726882586.70705: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba150> <<< 30529 1726882586.70719: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 30529 1726882586.70759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30529 1726882586.70913: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba8a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c18be30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1bb7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1bb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 30529 1726882586.70979: stdout 
chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1bbf20> <<< 30529 1726882586.70983: stdout chunk (state=3): >>>import 'pwd' # <<< 30529 1726882586.70991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30529 1726882586.70996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30529 1726882586.71019: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c025ca0> <<< 30529 1726882586.71085: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.71091: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c0278c0> <<< 30529 1726882586.71095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30529 1726882586.71100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30529 1726882586.71316: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0291c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02bec0> <<< 30529 1726882586.71337: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.71413: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899ca42de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02a1b0> <<< 30529 1726882586.71421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30529 1726882586.71424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30529 1726882586.71427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 30529 1726882586.71429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30529 1726882586.71437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30529 1726882586.71536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30529 1726882586.71553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 30529 1726882586.71570: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c033e30> <<< 30529 1726882586.71581: stdout chunk (state=3): >>>import '_tokenize' # <<< 30529 1726882586.71736: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032900> <<< 30529 1726882586.71740: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032660> <<< 30529 1726882586.71742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30529 1726882586.71748: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032bd0> <<< 30529 1726882586.71765: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02a6c0> <<< 30529 1726882586.71800: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c077fe0> <<< 30529 1726882586.72018: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c079c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c07c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c07a2d0> <<< 30529 1726882586.72035: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30529 1726882586.72089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882586.72189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 30529 1726882586.72195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 30529 1726882586.72198: stdout chunk (state=3): >>>import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f899c07f950> <<< 30529 1726882586.72267: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c07c320> <<< 30529 1726882586.72322: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.72400: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080a40> <<< 30529 1726882586.72403: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.72406: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080b90> <<< 30529 1726882586.72408: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.72614: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c078350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf08200> <<< 30529 1726882586.72654: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.72669: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf09460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c082990> <<< 30529 1726882586.72701: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.72718: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c083d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c082600> # zipimport: zlib available <<< 30529 1726882586.72730: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.72859: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 30529 1726882586.72864: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30529 1726882586.72918: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.72933: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30529 1726882586.72950: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.72961: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.72980: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 30529 1726882586.72988: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.73204: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.73313: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.73731: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.74251: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30529 1726882586.74283: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 30529 1726882586.74299: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30529 1726882586.74312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882586.74353: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf11640> <<< 30529 1726882586.74437: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30529 1726882586.74500: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf12420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf096a0> <<< 30529 1726882586.74528: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30529 1726882586.74535: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30529 1726882586.74624: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 30529 1726882586.74709: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.74949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 30529 1726882586.74952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf12330> # zipimport: zlib available <<< 30529 1726882586.75321: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.75792: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.75830: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.75906: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30529 1726882586.76107: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.76130: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 30529 1726882586.76200: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 30529 1726882586.76207: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 
30529 1726882586.76210: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.76216: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.76251: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30529 1726882586.76264: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.76482: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.76708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30529 1726882586.76760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30529 1726882586.77004: stdout chunk (state=3): >>>import '_ast' # <<< 30529 1726882586.77008: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf134d0> <<< 30529 1726882586.77011: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77013: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77015: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 30529 1726882586.77058: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 30529 1726882586.77147: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77191: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77245: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77425: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30529 1726882586.77443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf1e090> <<< 30529 1726882586.77462: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf19730> <<< 30529 1726882586.77481: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30529 1726882586.77496: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77557: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77621: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77648: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.77702: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882586.77790: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30529 1726882586.77795: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30529 1726882586.77798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches 
/usr/lib64/python3.12/argparse.py <<< 30529 1726882586.77870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30529 1726882586.77882: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0069c0> <<< 30529 1726882586.77930: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0fe690> <<< 30529 1726882586.78011: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf1e1e0> <<< 30529 1726882586.78018: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf1de20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 30529 1726882586.78048: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78084: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30529 1726882586.78129: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30529 1726882586.78196: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78205: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 30529 1726882586.78232: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78298: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78408: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78425: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30529 
1726882586.78444: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 30529 1726882586.78491: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78559: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78634: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78670: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78775: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 30529 1726882586.78779: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.78862: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.79033: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.79070: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.79125: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882586.79149: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 30529 1726882586.79163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 30529 1726882586.79188: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 30529 1726882586.79216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 30529 1726882586.79219: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb20c0> <<< 30529 1726882586.79265: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 30529 1726882586.79272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 30529 1726882586.79399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc07f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.79406: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0c590> <<< 30529 1726882586.79597: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf9aea0> <<< 30529 1726882586.79604: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb2c60> <<< 30529 1726882586.79607: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb0770> <<< 30529 1726882586.79609: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb0b00> <<< 30529 1726882586.79611: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 30529 1726882586.79614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30529 1726882586.79643: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0eb40> <<< 30529 1726882586.79805: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0ed20> <<< 30529 1726882586.79814: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0df70> <<< 30529 1726882586.79839: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f899bc0f470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 30529 1726882586.79846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30529 1726882586.79880: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc71fa0> <<< 30529 1726882586.79913: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0ff80> <<< 30529 1726882586.79953: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb04a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 30529 1726882586.80057: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80060: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 30529 1726882586.80062: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80065: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 30529 1726882586.80132: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80191: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 30529 1726882586.80281: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 
1726882586.80300: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 30529 1726882586.80338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 30529 1726882586.80389: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 30529 1726882586.80491: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80523: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30529 1726882586.80565: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80599: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.80833: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 30529 1726882586.81256: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.81661: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 30529 1726882586.81671: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.81722: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82102: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.fips' # <<< 30529 1726882586.82296: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82301: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 30529 1726882586.82305: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82307: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 30529 1726882586.82349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 30529 1726882586.82365: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc73830> <<< 30529 1726882586.82378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 30529 1726882586.82408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 30529 1726882586.82516: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc72a20> <<< 30529 1726882586.82598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 30529 1726882586.82613: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 30529 1726882586.82675: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.82915: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 30529 1726882586.82926: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30529 1726882586.82992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 30529 1726882586.83002: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.83402: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.83408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 30529 1726882586.83420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30529 1726882586.83422: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.83428: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882586.83431: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bcae1b0> <<< 30529 1726882586.83433: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc9df70> <<< 30529 1726882586.83435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 30529 1726882586.83622: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 30529 1726882586.83729: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.83732: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.83829: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.83971: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 30529 1726882586.83985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 30529 
1726882586.84020: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bcc1790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc9f170> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 30529 1726882586.84294: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 30529 1726882586.84298: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84300: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30529 1726882586.84347: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84498: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84636: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 30529 1726882586.84650: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84744: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84917: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30529 1726882586.84936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 
'ansible.module_utils.facts.hardware.darwin' # <<< 30529 1726882586.84948: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84963: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.84983: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.85121: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.85257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 30529 1726882586.85273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 30529 1726882586.85496: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.85511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 30529 1726882586.85521: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.85554: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.85589: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.86309: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.86689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 30529 1726882586.86914: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 30529 1726882586.86945: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 30529 1726882586.87056: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87203: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 30529 1726882586.87371: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87391: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 30529 1726882586.87402: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87435: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 30529 1726882586.87492: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87667: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87706: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.87873: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30529 1726882586.88156: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 30529 1726882586.88173: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88188: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30529 1726882586.88317: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88356: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 30529 1726882586.88388: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 30529 1726882586.88425: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88480: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88534: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 30529 1726882586.88577: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88606: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.88662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 30529 1726882586.88923: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 30529 1726882586.89244: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 30529 1726882586.89509: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 30529 1726882586.89634: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 30529 1726882586.89743: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89746: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 30529 1726882586.89749: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89776: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89816: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.base' # <<< 30529 1726882586.89833: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89855: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.89998: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30529 1726882586.90033: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 30529 1726882586.90119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 30529 1726882586.90175: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 30529 1726882586.90302: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90417: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 30529 1726882586.90620: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90656: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 30529 1726882586.90714: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90843: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 30529 1726882586.90885: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.90965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 30529 1726882586.90999: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 30529 1726882586.91067: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.91153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30529 1726882586.91291: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882586.92202: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 30529 1726882586.92413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30529 1726882586.92417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 30529 1726882586.92419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30529 1726882586.92421: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bacac90> <<< 30529 1726882586.92423: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bac8e30> <<< 30529 1726882586.92425: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bac8bc0> <<< 30529 1726882587.04649: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 30529 1726882587.04668: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 30529 1726882587.04829: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb10380> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb11280> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 30529 1726882587.04835: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb13590> <<< 30529 1726882587.04870: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb125d0> <<< 30529 1726882587.05092: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 30529 1726882587.29048: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_<<< 30529 1726882587.29088: stdout chunk (state=3): 
>>>SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.925982Z", "iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626925982", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.3984375, "15m": 0.224609375}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1020, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789773824, "block_size": 4096, "block_total": 65519099, "block_available": 63913519, "block_used": 1605580, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30529 1726882587.29737: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 30529 1726882587.29757: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 30529 1726882587.29839: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath <<< 30529 1726882587.29870: stdout chunk (state=3): >>># cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 30529 1726882587.29953: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # 
cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 30529 1726882587.30056: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime <<< 30529 1726882587.30074: stdout chunk (state=3): >>># cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 30529 1726882587.30241: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi <<< 30529 1726882587.30249: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual <<< 30529 1726882587.30275: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # 
cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline <<< 30529 1726882587.30295: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 30529 1726882587.30550: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30529 1726882587.30568: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30529 1726882587.30658: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 30529 1726882587.30667: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 30529 1726882587.30700: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 30529 1726882587.30727: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 30529 1726882587.30742: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 30529 1726882587.30764: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 30529 1726882587.30828: stdout chunk (state=3): >>># destroy selinux <<< 30529 1726882587.30836: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro <<< 30529 1726882587.30976: stdout chunk (state=3): >>># destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 30529 1726882587.30979: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # 
destroy multiprocessing.reduction # destroy selectors <<< 30529 1726882587.30982: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 30529 1726882587.31018: stdout chunk (state=3): >>># destroy datetime # destroy subprocess <<< 30529 1726882587.31046: stdout chunk (state=3): >>># destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct <<< 30529 1726882587.31065: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno <<< 30529 1726882587.31151: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon <<< 30529 1726882587.31161: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 30529 1726882587.31197: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping 
encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 30529 1726882587.31347: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os<<< 30529 1726882587.31364: stdout chunk (state=3): >>> # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30529 1726882587.31541: stdout chunk (state=3): >>># destroy sys.monitoring <<< 30529 1726882587.31548: stdout chunk (state=3): >>># destroy _socket <<< 30529 
1726882587.31551: stdout chunk (state=3): >>># destroy _collections <<< 30529 1726882587.31554: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 30529 1726882587.31810: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 30529 1726882587.31841: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 30529 1726882587.31870: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 30529 1726882587.31876: stdout chunk (state=3): >>># clear sys.audit hooks <<< 30529 1726882587.32239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882587.32242: stdout chunk (state=3): >>><<< 30529 1726882587.32244: stderr chunk (state=3): >>><<< 30529 1726882587.32381: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca09130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca09fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca47da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca47fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca7f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca7fe00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5fa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5d160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca44f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9f6b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9e2d0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca5e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca9cb60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca441d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899cad4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad4a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899cad4dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899ca42cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad54c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad5190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad63c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf05c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf1d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf2ba0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf20f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899caf3c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899caf33b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad6330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c7e3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80c6e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80c440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80c710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c80d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80c890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c7e1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80daf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899cad6ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c837110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c85b470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8bc290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8be9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c8bc3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c889280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c85a270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c80fce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f899c85a870> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_a8ic8e4g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f899c18b1a0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c16a090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1691f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c188e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1baa80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1ba8a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c18be30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1bb7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c1bb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c1bbf20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c025ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c0278c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f899c0282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0291c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02bec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899ca42de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02a1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f899c033e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c032bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c02a6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c077fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c079c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c07c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c07a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c07f950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c07c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c080ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c078350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf08200> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf09460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c082990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f899c083d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c082600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf11640> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf12420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf096a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf12330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf134d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bf1e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf19730> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0069c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899c0fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf1e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf1de20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc07f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0c590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bf9aea0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb2c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb0b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0eb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc0ed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0f470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bc71fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc0ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bfb04a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc73830> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc72a20> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bcae1b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc9df70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bcc1790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bc9f170> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f899bacac90> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bac8e30> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bac8bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb10380> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb11280> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb13590> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f899bb125d0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.925982Z", "iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626925982", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.3984375, "15m": 0.224609375}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", 
"sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1020, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789773824, "block_size": 4096, "block_total": 65519099, "block_available": 63913519, "block_used": 1605580, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # 
cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # 
cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
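The interpreter-discovery warning above means Ansible chose `/usr/bin/python3.12` by probing the managed node rather than from explicit configuration. A minimal sketch of the probing idea follows; the candidate list and its order here are assumptions for illustration, not ansible-core's actual discovery table:

```python
import os

# Assumed candidate order for illustration only; ansible-core maintains
# its own per-platform interpreter table.
CANDIDATES = ["/usr/bin/python3.12", "/usr/bin/python3", "/usr/bin/python"]

def discover_interpreter(candidates=CANDIDATES):
    """Return the first candidate path that exists on the target.

    This is the ambiguity the warning describes: installing another
    Python later can change which path wins, silently changing the
    interpreter modules run under.
    """
    for path in candidates:
        if os.path.exists(path):
            return path
    return None
```

Pinning `ansible_python_interpreter` in inventory or group vars sidesteps discovery entirely and makes the warning go away.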
30529 1726882587.34306: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882587.34309: _low_level_execute_command(): starting 30529 1726882587.34312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882586.038432-30541-269957735684662/ > /dev/null 2>&1 && sleep 0' 30529 1726882587.34583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.34667: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882587.34742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882587.34787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882587.36684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882587.36690: stdout chunk (state=3): >>><<< 30529 1726882587.36760: stderr chunk (state=3): >>><<< 30529 1726882587.36766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882587.36768: handler run complete 30529 1726882587.37016: variable 'ansible_facts' from source: unknown 30529 1726882587.37205: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.37808: variable 'ansible_facts' from source: unknown 30529 1726882587.38001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.38233: attempt loop complete, returning result 30529 1726882587.38237: _execute() done 30529 1726882587.38239: dumping result to json 30529 1726882587.38269: done dumping result, returning 30529 1726882587.38276: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-b0f1-edc0-00000000001b] 30529 1726882587.38279: sending task result for task 12673a56-9f93-b0f1-edc0-00000000001b 30529 1726882587.38879: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000001b 30529 1726882587.38882: WORKER PROCESS EXITING ok: [managed_node1] 30529 1726882587.39479: no more pending results, returning what we have 30529 1726882587.39483: results queue empty 30529 1726882587.39483: checking for any_errors_fatal 30529 1726882587.39485: done checking for any_errors_fatal 30529 1726882587.39488: checking for max_fail_percentage 30529 1726882587.39490: done checking for max_fail_percentage 30529 1726882587.39491: checking to see if all hosts have failed and the running result is not ok 30529 1726882587.39492: done checking to see if all hosts have failed 30529 1726882587.39495: getting the remaining hosts for this loop 30529 1726882587.39496: done getting the remaining hosts for this loop 30529 1726882587.39500: getting the next task for host managed_node1 30529 1726882587.39506: done getting next task for host managed_node1 30529 1726882587.39508: ^ task is: TASK: meta (flush_handlers) 30529 1726882587.39554: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882587.39559: getting variables 30529 1726882587.39560: in VariableManager get_vars() 30529 1726882587.39630: Calling all_inventory to load vars for managed_node1 30529 1726882587.39634: Calling groups_inventory to load vars for managed_node1 30529 1726882587.39637: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.39647: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.39649: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.39653: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.40127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.40532: done with get_vars() 30529 1726882587.40542: done getting variables 30529 1726882587.40723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 30529 1726882587.40775: in VariableManager get_vars() 30529 1726882587.40783: Calling all_inventory to load vars for managed_node1 30529 1726882587.40785: Calling groups_inventory to load vars for managed_node1 30529 1726882587.40787: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.40791: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.40794: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.40797: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.41061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.41451: done with get_vars() 30529 1726882587.41464: done queuing things up, now waiting for results queue to drain 30529 1726882587.41465: results queue empty 30529 1726882587.41466: checking for any_errors_fatal 30529 1726882587.41468: done checking for any_errors_fatal 30529 1726882587.41469: checking 
for max_fail_percentage 30529 1726882587.41470: done checking for max_fail_percentage 30529 1726882587.41471: checking to see if all hosts have failed and the running result is not ok 30529 1726882587.41477: done checking to see if all hosts have failed 30529 1726882587.41477: getting the remaining hosts for this loop 30529 1726882587.41478: done getting the remaining hosts for this loop 30529 1726882587.41481: getting the next task for host managed_node1 30529 1726882587.41485: done getting next task for host managed_node1 30529 1726882587.41487: ^ task is: TASK: Include the task 'el_repo_setup.yml' 30529 1726882587.41488: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882587.41490: getting variables 30529 1726882587.41491: in VariableManager get_vars() 30529 1726882587.41602: Calling all_inventory to load vars for managed_node1 30529 1726882587.41605: Calling groups_inventory to load vars for managed_node1 30529 1726882587.41607: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.41612: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.41615: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.41622: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.41923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.42384: done with get_vars() 30529 1726882587.42392: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 21:36:27 -0400 (0:00:01.425) 
0:00:01.450 ****** 30529 1726882587.42465: entering _queue_task() for managed_node1/include_tasks 30529 1726882587.42467: Creating lock for include_tasks 30529 1726882587.42777: worker is 1 (out of 1 available) 30529 1726882587.42789: exiting _queue_task() for managed_node1/include_tasks 30529 1726882587.42930: done queuing things up, now waiting for results queue to drain 30529 1726882587.42932: waiting for pending results... 30529 1726882587.43508: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 30529 1726882587.43513: in run() - task 12673a56-9f93-b0f1-edc0-000000000006 30529 1726882587.43515: variable 'ansible_search_path' from source: unknown 30529 1726882587.43517: calling self._execute() 30529 1726882587.43519: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882587.43521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882587.43523: variable 'omit' from source: magic vars 30529 1726882587.43525: _execute() done 30529 1726882587.43526: dumping result to json 30529 1726882587.43528: done dumping result, returning 30529 1726882587.43530: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-b0f1-edc0-000000000006] 30529 1726882587.43532: sending task result for task 12673a56-9f93-b0f1-edc0-000000000006 30529 1726882587.43598: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000006 30529 1726882587.43601: WORKER PROCESS EXITING 30529 1726882587.43635: no more pending results, returning what we have 30529 1726882587.43639: in VariableManager get_vars() 30529 1726882587.43663: Calling all_inventory to load vars for managed_node1 30529 1726882587.43665: Calling groups_inventory to load vars for managed_node1 30529 1726882587.43668: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.43676: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.43679: 
Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.43684: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.43865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.44095: done with get_vars() 30529 1726882587.44102: variable 'ansible_search_path' from source: unknown 30529 1726882587.44114: we have included files to process 30529 1726882587.44115: generating all_blocks data 30529 1726882587.44117: done generating all_blocks data 30529 1726882587.44118: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30529 1726882587.44119: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30529 1726882587.44121: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30529 1726882587.44794: in VariableManager get_vars() 30529 1726882587.44810: done with get_vars() 30529 1726882587.44822: done processing included file 30529 1726882587.44824: iterating over new_blocks loaded from include file 30529 1726882587.44825: in VariableManager get_vars() 30529 1726882587.44834: done with get_vars() 30529 1726882587.44836: filtering new block on tags 30529 1726882587.44848: done filtering new block on tags 30529 1726882587.44851: in VariableManager get_vars() 30529 1726882587.44860: done with get_vars() 30529 1726882587.44861: filtering new block on tags 30529 1726882587.44875: done filtering new block on tags 30529 1726882587.44878: in VariableManager get_vars() 30529 1726882587.44887: done with get_vars() 30529 1726882587.44888: filtering new block on tags 30529 1726882587.44906: done filtering new block on tags 30529 1726882587.44908: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 30529 1726882587.44914: extending task lists for all hosts with included blocks 30529 1726882587.44959: done extending task lists 30529 1726882587.44960: done processing included files 30529 1726882587.44961: results queue empty 30529 1726882587.44961: checking for any_errors_fatal 30529 1726882587.44962: done checking for any_errors_fatal 30529 1726882587.44963: checking for max_fail_percentage 30529 1726882587.44964: done checking for max_fail_percentage 30529 1726882587.44965: checking to see if all hosts have failed and the running result is not ok 30529 1726882587.44966: done checking to see if all hosts have failed 30529 1726882587.44966: getting the remaining hosts for this loop 30529 1726882587.44968: done getting the remaining hosts for this loop 30529 1726882587.44970: getting the next task for host managed_node1 30529 1726882587.44974: done getting next task for host managed_node1 30529 1726882587.44976: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 30529 1726882587.44978: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882587.44979: getting variables 30529 1726882587.44980: in VariableManager get_vars() 30529 1726882587.44988: Calling all_inventory to load vars for managed_node1 30529 1726882587.44989: Calling groups_inventory to load vars for managed_node1 30529 1726882587.44991: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.44997: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.44999: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.45001: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.45134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.45373: done with get_vars() 30529 1726882587.45382: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:36:27 -0400 (0:00:00.029) 0:00:01.480 ****** 30529 1726882587.45452: entering _queue_task() for managed_node1/setup 30529 1726882587.45911: worker is 1 (out of 1 available) 30529 1726882587.45918: exiting _queue_task() for managed_node1/setup 30529 1726882587.45928: done queuing things up, now waiting for results queue to drain 30529 1726882587.45929: waiting for pending results... 
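Each task banner above ends with two timings, e.g. `(0:00:00.029) 0:00:01.480`: the duration of the task that just finished, then the cumulative elapsed time of the run. A small sketch of that bookkeeping (class and method names are illustrative, not the timer callback's internals):

```python
import time

class TaskTimer:
    """Track per-task and cumulative durations, as in the log banners."""

    def __init__(self):
        self.start = self.last = time.monotonic()

    def mark(self):
        # Called at each task boundary: returns (last-task duration,
        # cumulative elapsed time) and resets the per-task clock.
        now = time.monotonic()
        task, total = now - self.last, now - self.start
        self.last = now
        return task, total
```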
30529 1726882587.46274: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 30529 1726882587.46280: in run() - task 12673a56-9f93-b0f1-edc0-00000000002c 30529 1726882587.46283: variable 'ansible_search_path' from source: unknown 30529 1726882587.46286: variable 'ansible_search_path' from source: unknown 30529 1726882587.46288: calling self._execute() 30529 1726882587.46404: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882587.46411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882587.46415: variable 'omit' from source: magic vars 30529 1726882587.46900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882587.49099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882587.49243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882587.49246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882587.49420: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882587.49449: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882587.49559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882587.49615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882587.49676: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882587.49787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882587.50129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882587.50188: variable 'ansible_facts' from source: unknown 30529 1726882587.50369: variable 'network_test_required_facts' from source: task vars 30529 1726882587.50411: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 30529 1726882587.50420: when evaluation is False, skipping this task 30529 1726882587.50427: _execute() done 30529 1726882587.50433: dumping result to json 30529 1726882587.50439: done dumping result, returning 30529 1726882587.50451: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-b0f1-edc0-00000000002c] 30529 1726882587.50464: sending task result for task 12673a56-9f93-b0f1-edc0-00000000002c 30529 1726882587.50747: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000002c 30529 1726882587.50751: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 30529 1726882587.50850: no more pending results, returning what we have 30529 1726882587.50853: results queue empty 30529 1726882587.50854: checking for any_errors_fatal 30529 1726882587.50855: 
done checking for any_errors_fatal 30529 1726882587.50856: checking for max_fail_percentage 30529 1726882587.50858: done checking for max_fail_percentage 30529 1726882587.50859: checking to see if all hosts have failed and the running result is not ok 30529 1726882587.50860: done checking to see if all hosts have failed 30529 1726882587.50861: getting the remaining hosts for this loop 30529 1726882587.50862: done getting the remaining hosts for this loop 30529 1726882587.50866: getting the next task for host managed_node1 30529 1726882587.50877: done getting next task for host managed_node1 30529 1726882587.50880: ^ task is: TASK: Check if system is ostree 30529 1726882587.50882: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882587.50886: getting variables 30529 1726882587.50894: in VariableManager get_vars() 30529 1726882587.50924: Calling all_inventory to load vars for managed_node1 30529 1726882587.50926: Calling groups_inventory to load vars for managed_node1 30529 1726882587.50930: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882587.50941: Calling all_plugins_play to load vars for managed_node1 30529 1726882587.50944: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882587.50946: Calling groups_plugins_play to load vars for managed_node1 30529 1726882587.51342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882587.51938: done with get_vars() 30529 1726882587.51949: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:36:27 -0400 (0:00:00.067) 0:00:01.547 ****** 30529 1726882587.52189: entering _queue_task() for managed_node1/stat 30529 1726882587.52720: worker is 1 (out of 1 available) 30529 1726882587.52733: exiting _queue_task() for managed_node1/stat 30529 1726882587.52744: done queuing things up, now waiting for results queue to drain 30529 1726882587.52745: waiting for pending results... 
30529 1726882587.53242: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 30529 1726882587.53398: in run() - task 12673a56-9f93-b0f1-edc0-00000000002e 30529 1726882587.53669: variable 'ansible_search_path' from source: unknown 30529 1726882587.53672: variable 'ansible_search_path' from source: unknown 30529 1726882587.53675: calling self._execute() 30529 1726882587.53715: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882587.53788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882587.53806: variable 'omit' from source: magic vars 30529 1726882587.54818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882587.55332: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882587.55379: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882587.55445: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882587.55557: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882587.55848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882587.56064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882587.56068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882587.56070: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882587.56281: Evaluated conditional (not __network_is_ostree is defined): True 30529 1726882587.56710: variable 'omit' from source: magic vars 30529 1726882587.56714: variable 'omit' from source: magic vars 30529 1726882587.56716: variable 'omit' from source: magic vars 30529 1726882587.56719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882587.56955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882587.57037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882587.57074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882587.57091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882587.57209: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882587.57263: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882587.57272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882587.57528: Set connection var ansible_shell_executable to /bin/sh 30529 1726882587.57800: Set connection var ansible_pipelining to False 30529 1726882587.57803: Set connection var ansible_shell_type to sh 30529 1726882587.57806: Set connection var ansible_timeout to 10 30529 1726882587.57808: Set connection var ansible_connection to ssh 30529 1726882587.57810: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882587.57812: variable 'ansible_shell_executable' from source: unknown 30529 1726882587.57814: variable 'ansible_connection' from 
source: unknown 30529 1726882587.57816: variable 'ansible_module_compression' from source: unknown 30529 1726882587.57818: variable 'ansible_shell_type' from source: unknown 30529 1726882587.57820: variable 'ansible_shell_executable' from source: unknown 30529 1726882587.57822: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882587.57824: variable 'ansible_pipelining' from source: unknown 30529 1726882587.57827: variable 'ansible_timeout' from source: unknown 30529 1726882587.57829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882587.58314: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882587.58357: variable 'omit' from source: magic vars 30529 1726882587.58384: starting attempt loop 30529 1726882587.58485: running the handler 30529 1726882587.58488: _low_level_execute_command(): starting 30529 1726882587.58491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882587.59861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882587.60055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.60146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882587.60168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882587.60181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882587.60260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882587.61860: stdout chunk (state=3): >>>/root <<< 30529 1726882587.61996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882587.62003: stdout chunk (state=3): >>><<< 30529 1726882587.62010: stderr chunk (state=3): >>><<< 30529 1726882587.62030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882587.62043: _low_level_execute_command(): starting 30529 1726882587.62049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454 `" && echo ansible-tmp-1726882587.6203003-30580-7876301810454="` echo /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454 `" ) && sleep 0' 30529 1726882587.63259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882587.63265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882587.63268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.63270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882587.63272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.63422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 30529 1726882587.63438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882587.65281: stdout chunk (state=3): >>>ansible-tmp-1726882587.6203003-30580-7876301810454=/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454 <<< 30529 1726882587.65439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882587.65442: stderr chunk (state=3): >>><<< 30529 1726882587.65447: stdout chunk (state=3): >>><<< 30529 1726882587.65453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882587.6203003-30580-7876301810454=/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882587.65544: variable 'ansible_module_compression' from source: unknown 30529 1726882587.65559: 
ANSIBALLZ: Using lock for stat 30529 1726882587.65566: ANSIBALLZ: Acquiring lock 30529 1726882587.65568: ANSIBALLZ: Lock acquired: 139794692461904 30529 1726882587.65570: ANSIBALLZ: Creating module 30529 1726882587.90544: ANSIBALLZ: Writing module into payload 30529 1726882587.90753: ANSIBALLZ: Writing module 30529 1726882587.90817: ANSIBALLZ: Renaming module 30529 1726882587.90820: ANSIBALLZ: Done creating module 30529 1726882587.91014: variable 'ansible_facts' from source: unknown 30529 1726882587.91089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py 30529 1726882587.91491: Sending initial data 30529 1726882587.91496: Sent initial data (151 bytes) 30529 1726882587.92681: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882587.92685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882587.92687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882587.92689: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882587.92691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.92695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882587.92697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882587.92766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882587.92838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882587.94450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882587.94515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882587.94582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppftc9vhc /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py <<< 30529 1726882587.94585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py" <<< 30529 1726882587.94647: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppftc9vhc" to remote "/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py" <<< 30529 1726882587.95664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882587.95907: stderr chunk (state=3): >>><<< 30529 1726882587.95911: stdout chunk (state=3): >>><<< 30529 1726882587.95955: done transferring module to remote 30529 1726882587.95958: _low_level_execute_command(): starting 30529 1726882587.95960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/ /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py && sleep 0' 30529 1726882587.96717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882587.96742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882587.96745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.96748: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882587.96750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882587.96763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.96811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882587.96826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882587.96870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882587.98939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882587.98942: stdout chunk (state=3): >>><<< 30529 1726882587.98944: stderr chunk (state=3): >>><<< 30529 1726882587.98946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882587.98948: _low_level_execute_command(): starting 30529 1726882587.98950: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/AnsiballZ_stat.py && sleep 0' 30529 1726882587.99855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882587.99870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882587.99883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882587.99939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882587.99971: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.00126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.02147: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30529 1726882588.02183: stdout chunk (state=3): >>>import _imp # builtin <<< 30529 1726882588.02213: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 30529 1726882588.02283: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30529 1726882588.02314: stdout chunk (state=3): >>>import 'posix' # <<< 30529 1726882588.02342: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 30529 1726882588.02376: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 30529 1726882588.02388: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 30529 1726882588.02435: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 30529 1726882588.02453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 30529 1726882588.02472: stdout chunk (state=3): >>>import 'codecs' # <<< 30529 1726882588.02530: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30529 1726882588.02559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 30529 1726882588.02579: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 30529 1726882588.02602: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fbea50> import '_signal' # <<< 30529 1726882588.02638: stdout chunk (state=3): >>>import '_abc' # <<< 30529 1726882588.02664: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 30529 1726882588.02692: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30529 1726882588.02807: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30529 1726882588.02836: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # <<< 30529 1726882588.02869: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 30529 1726882588.02922: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30529 1726882588.02932: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fcd130> <<< 30529 1726882588.03077: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 
14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30529 1726882588.03318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30529 1726882588.03338: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30529 1726882588.03384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30529 1726882588.03423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30529 1726882588.03426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 30529 1726882588.03464: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dcbe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 30529 1726882588.03487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30529 1726882588.03491: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dcbef0> <<< 30529 1726882588.03576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30529 1726882588.03580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 30529 1726882588.03582: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30529 1726882588.03678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.03681: stdout chunk (state=3): >>>import 'itertools' # <<< 30529 1726882588.03688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e03860> <<< 30529 1726882588.03747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e03ef0> import '_collections' # <<< 30529 1726882588.03752: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de3b30> <<< 30529 1726882588.03804: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de1220> <<< 30529 1726882588.03925: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc9010> <<< 30529 1726882588.04011: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 30529 1726882588.04028: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30529 1726882588.04098: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e223c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 30529 1726882588.04101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dca8d0> <<< 30529 1726882588.04135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 30529 1726882588.04166: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30529 1726882588.04217: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.04253: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fea13e58c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e58b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e58ec0> <<< 30529 1726882588.04273: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc6db0> <<< 30529 1726882588.04299: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.04302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30529 1726882588.04374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30529 1726882588.04379: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e59580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e59250> <<< 30529 1726882588.04382: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 30529 1726882588.04384: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 30529 1726882588.04402: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5a450> <<< 30529 1726882588.04423: stdout chunk (state=3): >>>import 'importlib.util' # <<< 30529 1726882588.04445: 
stdout chunk (state=3): >>>import 'runpy' # <<< 30529 1726882588.04449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30529 1726882588.04478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30529 1726882588.04522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 30529 1726882588.04526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e70680> import 'errno' # <<< 30529 1726882588.04625: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e71d30> <<< 30529 1726882588.04628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 30529 1726882588.04630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30529 1726882588.04633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 30529 1726882588.04635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e72bd0> <<< 30529 1726882588.04697: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e73230> <<< 30529 1726882588.04700: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e72120> <<< 30529 1726882588.04713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30529 1726882588.04751: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.04767: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e73cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e733e0> <<< 30529 1726882588.04817: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5a4b0> <<< 30529 1726882588.04850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30529 1726882588.04855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30529 1726882588.04879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30529 1726882588.04913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30529 1726882588.04935: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c07b90> <<< 30529 1726882588.04978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.04982: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30620> <<< 30529 1726882588.05006: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c30380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30650> <<< 30529 1726882588.05045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30529 1726882588.05102: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.05220: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30f80> <<< 30529 1726882588.05338: stdout chunk (state=3): >>># extension 
module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.05340: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.05364: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c318b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c30830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c05d30> <<< 30529 1726882588.05373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30529 1726882588.05398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 30529 1726882588.05416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 30529 1726882588.05427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 30529 1726882588.05437: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c32c90> <<< 30529 1726882588.05465: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c319d0> <<< 30529 1726882588.05475: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5aba0> <<< 30529 1726882588.05501: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30529 1726882588.05563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.05573: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30529 1726882588.05610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30529 1726882588.05633: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c5b020> <<< 30529 1726882588.05688: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30529 1726882588.05698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.05720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30529 1726882588.05737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30529 1726882588.05778: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c7f380> <<< 30529 1726882588.05803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30529 1726882588.05843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30529 1726882588.05904: stdout chunk (state=3): >>>import 'ntpath' # <<< 30529 1726882588.05922: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce0110> <<< 30529 1726882588.05942: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30529 1726882588.05966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30529 1726882588.05992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30529 1726882588.06029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30529 1726882588.06114: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce2870> <<< 30529 1726882588.06180: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce0230> <<< 30529 1726882588.06221: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13cad130> <<< 30529 1726882588.06254: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 30529 1726882588.06265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13aed220> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c7e180> <<< 30529 1726882588.06268: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c33bf0> <<< 30529 1726882588.06376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30529 1726882588.06402: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fea13c7e780> <<< 30529 1726882588.06585: stdout chunk (state=3): >>># zipimport: found 
30 names in '/tmp/ansible_stat_payload_uva5jb2f/ansible_stat_payload.zip' <<< 30529 1726882588.06592: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.06715: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.06745: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 30529 1726882588.06753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30529 1726882588.06807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30529 1726882588.06938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30529 1726882588.06941: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b3ef90> import '_typing' # <<< 30529 1726882588.07129: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b1de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b1d010> # zipimport: zlib available <<< 30529 1726882588.07154: stdout chunk (state=3): >>>import 'ansible' # <<< 30529 1726882588.07176: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.07201: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.07228: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 30529 1726882588.07239: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.08604: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 
1726882588.09710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 30529 1726882588.09718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b3ce30> <<< 30529 1726882588.09737: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 30529 1726882588.09743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.09765: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 30529 1726882588.09779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30529 1726882588.09804: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30529 1726882588.09835: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.09840: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6e8d0> <<< 30529 1726882588.09870: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6e660> <<< 30529 1726882588.09906: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6df70> <<< 30529 
1726882588.09927: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 30529 1726882588.09937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30529 1726882588.09973: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6e9c0> <<< 30529 1726882588.09979: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b3f9b0> <<< 30529 1726882588.09984: stdout chunk (state=3): >>>import 'atexit' # <<< 30529 1726882588.10015: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6f650> <<< 30529 1726882588.10043: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6f890> <<< 30529 1726882588.10068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30529 1726882588.10110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 30529 1726882588.10123: stdout chunk (state=3): >>>import '_locale' # <<< 30529 1726882588.10167: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6fdd0> <<< 30529 1726882588.10183: stdout chunk (state=3): >>>import 'pwd' # <<< 30529 
1726882588.10197: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30529 1726882588.10224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30529 1726882588.10259: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1350db20> <<< 30529 1726882588.10286: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1350f740> <<< 30529 1726882588.10313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30529 1726882588.10329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30529 1726882588.10372: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13510140> <<< 30529 1726882588.10383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30529 1726882588.10417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 30529 1726882588.10432: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135112e0> <<< 30529 1726882588.10461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30529 1726882588.10490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 30529 
1726882588.10513: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30529 1726882588.10568: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13513d70> <<< 30529 1726882588.10605: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.10614: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b1ef60> <<< 30529 1726882588.10624: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13512030> <<< 30529 1726882588.10649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30529 1726882588.10677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30529 1726882588.10700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30529 1726882588.10722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30529 1726882588.10752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30529 1726882588.10778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 30529 
1726882588.10782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 30529 1726882588.10804: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351bcb0> <<< 30529 1726882588.10809: stdout chunk (state=3): >>>import '_tokenize' # <<< 30529 1726882588.10873: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351a780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351a4e0> <<< 30529 1726882588.10910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 30529 1726882588.10913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30529 1726882588.10983: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351aa50> <<< 30529 1726882588.11018: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13512540> <<< 30529 1726882588.11043: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11048: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13563fe0> <<< 30529 1726882588.11072: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 30529 1726882588.11077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea135640e0> <<< 30529 1726882588.11097: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 30529 1726882588.11119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 30529 1726882588.11141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 30529 1726882588.11146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 30529 1726882588.11180: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11183: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13565b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13565940> <<< 30529 1726882588.11208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 30529 1726882588.11306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30529 1726882588.11356: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13568170> <<< 30529 1726882588.11360: stdout chunk (state=3): >>>import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea13566270> <<< 30529 1726882588.11385: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30529 1726882588.11419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.11450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 30529 1726882588.11457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30529 1726882588.11470: stdout chunk (state=3): >>>import '_string' # <<< 30529 1726882588.11507: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356b950> <<< 30529 1726882588.11628: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13568320> <<< 30529 1726882588.11691: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11696: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356c710> <<< 30529 1726882588.11722: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11727: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fea1356c7d0> <<< 30529 1726882588.11771: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11774: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356ca10> <<< 30529 1726882588.11787: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13564290> <<< 30529 1726882588.11809: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30529 1726882588.11828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30529 1726882588.11847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 30529 1726882588.11881: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.11907: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135f8230> <<< 30529 1726882588.12048: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30529 1726882588.12054: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135f9700> <<< 30529 1726882588.12073: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356e9f0> <<< 30529 1726882588.12112: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356fd70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356e630> <<< 30529 1726882588.12126: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12144: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12152: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 30529 1726882588.12163: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12254: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12339: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12356: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30529 1726882588.12377: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12390: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 30529 1726882588.12410: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12533: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.12649: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 
1726882588.13182: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.13727: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30529 1726882588.13731: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 30529 1726882588.13733: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 30529 1726882588.13759: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 30529 1726882588.13770: stdout chunk (state=3): >>> <<< 30529 1726882588.13773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.13829: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135fd970> <<< 30529 1726882588.13908: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 30529 1726882588.13914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30529 1726882588.13930: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135fe7b0> <<< 30529 1726882588.13945: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135f9550> <<< 30529 1726882588.13987: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30529 1726882588.14000: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 
1726882588.14018: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.14038: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 30529 1726882588.14046: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.14195: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.14347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 30529 1726882588.14352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30529 1726882588.14367: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135fe4b0> <<< 30529 1726882588.14381: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.14826: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15271: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15342: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15417: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30529 1726882588.15420: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15467: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15502: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30529 1726882588.15517: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15579: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15668: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30529 1726882588.15676: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15698: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 30529 1726882588.15710: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30529 1726882588.15752: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.15794: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30529 1726882588.15798: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16027: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30529 1726882588.16317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30529 1726882588.16320: stdout chunk (state=3): >>>import '_ast' # <<< 30529 1726882588.16395: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135ff9b0> <<< 30529 1726882588.16406: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16474: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16552: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 30529 1726882588.16556: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 30529 1726882588.16576: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 30529 1726882588.16588: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16636: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16672: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30529 1726882588.16689: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16731: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16777: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.16833: stdout chunk (state=3): >>># zipimport: zlib available <<< 
30529 1726882588.16903: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30529 1726882588.16936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.17015: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1340a330> <<< 30529 1726882588.17051: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea134052e0> <<< 30529 1726882588.17082: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30529 1726882588.17095: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17161: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17222: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17250: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17298: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30529 1726882588.17319: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30529 1726882588.17338: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30529 1726882588.17361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30529 1726882588.17417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30529 1726882588.17442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 30529 1726882588.17453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30529 1726882588.17512: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ba2c90> <<< 30529 1726882588.17552: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13bb2960> <<< 30529 1726882588.17627: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1340a4b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135f9850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 30529 1726882588.17644: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17675: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17708: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30529 1726882588.17758: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30529 1726882588.17773: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17797: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 30529 1726882588.17815: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.17940: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 30529 1726882588.18130: stdout chunk (state=3): >>># zipimport: zlib available <<< 30529 1726882588.18250: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 30529 1726882588.18255: stdout chunk (state=3): >>># destroy __main__ <<< 30529 1726882588.18538: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 30529 1726882588.18541: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 30529 1726882588.18544: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 30529 1726882588.18568: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 30529 1726882588.18597: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing 
os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib <<< 30529 1726882588.18603: stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math <<< 30529 1726882588.18630: stdout chunk (state=3): >>># cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random <<< 30529 1726882588.18647: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset # 
destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path <<< 30529 1726882588.18654: stdout chunk (state=3): >>># cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp<<< 30529 1726882588.18659: stdout chunk (state=3): >>> # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 30529 1726882588.18698: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 30529 1726882588.18703: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # 
cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon <<< 30529 1726882588.18709: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian <<< 30529 1726882588.18733: stdout chunk (state=3): >>># cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # 
cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters <<< 30529 1726882588.18744: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 30529 1726882588.18761: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 30529 1726882588.19020: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30529 1726882588.19027: stdout chunk (state=3): >>># destroy _bz2 <<< 30529 1726882588.19032: stdout chunk (state=3): >>># destroy _compression # destroy _lzma <<< 30529 1726882588.19057: stdout chunk (state=3): >>># destroy _blake2 # destroy 
binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 30529 1726882588.19060: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 30529 1726882588.19098: stdout chunk (state=3): >>># destroy ntpath <<< 30529 1726882588.19110: stdout chunk (state=3): >>># destroy importlib <<< 30529 1726882588.19131: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 30529 1726882588.19140: stdout chunk (state=3): >>># destroy _locale # destroy pwd<<< 30529 1726882588.19159: stdout chunk (state=3): >>> # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 30529 1726882588.19175: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 30529 1726882588.19183: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 30529 1726882588.19199: stdout chunk (state=3): >>># destroy array <<< 30529 1726882588.19216: stdout chunk (state=3): >>># destroy datetime <<< 30529 1726882588.19232: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 30529 1726882588.19272: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 30529 1726882588.19286: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 30529 1726882588.19304: stdout chunk (state=3): >>># 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 30529 1726882588.19319: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 30529 1726882588.19337: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 30529 1726882588.19359: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 30529 1726882588.19373: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 30529 1726882588.19378: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 30529 1726882588.19390: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 30529 1726882588.19406: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 30529 1726882588.19420: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # 
cleanup[3] wiping encodings.utf_8 <<< 30529 1726882588.19424: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 30529 1726882588.19445: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30529 1726882588.19579: stdout chunk (state=3): >>># destroy sys.monitoring <<< 30529 1726882588.19591: stdout chunk (state=3): >>># destroy _socket <<< 30529 1726882588.19598: stdout chunk (state=3): >>># destroy _collections <<< 30529 1726882588.19628: stdout chunk (state=3): >>># destroy platform <<< 30529 1726882588.19631: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 30529 1726882588.19653: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 30529 1726882588.19659: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 30529 1726882588.19680: stdout chunk (state=3): >>># destroy _typing <<< 30529 1726882588.19689: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 30529 1726882588.19710: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # 
destroy _imp # destroy _io # destroy marshal <<< 30529 1726882588.19721: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30529 1726882588.19817: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 30529 1726882588.19820: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 30529 1726882588.19823: stdout chunk (state=3): >>># destroy time <<< 30529 1726882588.19854: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 30529 1726882588.19867: stdout chunk (state=3): >>># destroy _hashlib <<< 30529 1726882588.19881: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 30529 1726882588.19895: stdout chunk (state=3): >>># destroy itertools <<< 30529 1726882588.19916: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 30529 1726882588.19923: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 30529 1726882588.20253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882588.20276: stderr chunk (state=3): >>><<< 30529 1726882588.20280: stdout chunk (state=3): >>><<< 30529 1726882588.20347: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13fcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dcbe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dcbef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e03860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e03ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de3b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de1220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc9010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e223c0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13de20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dca8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e58c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e58b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e58ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13dc6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e59580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e59250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e70680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e71d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e72bd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e73230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e72120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13e73cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e733e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c07b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c30380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c30f80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13c318b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c30830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c05d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c32c90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c319d0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13e5aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c5b020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c7f380> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce0110> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce2870> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ce0230> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13cad130> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13aed220> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c7e180> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13c33bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fea13c7e780> # zipimport: found 30 names in '/tmp/ansible_stat_payload_uva5jb2f/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fea13b3ef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b1de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b1d010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b3ce30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6e8d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6e660> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6df70> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6e9c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b3f9b0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6f650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b6f890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13b6fdd0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1350db20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1350f740> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea13510140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135112e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13513d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13b1ef60> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13512030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351bcb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351a780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351a4e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1351aa50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13512540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13563fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135640e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13565b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13565940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea13568170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13566270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13568320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356c710> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356c7d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356ca10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13564290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135f8230> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135f9700> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356e9f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1356fd70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1356e630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea135fd970> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135fe7b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135f9550> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135fe4b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135ff9b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea1340a330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea134052e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13ba2c90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea13bb2960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea1340a4b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea135f9850> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
[interpreter shutdown trace repeated verbatim from the stderr dump above]
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30529 1726882588.20838: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882588.20841: _low_level_execute_command(): starting 30529 1726882588.20843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882587.6203003-30580-7876301810454/ > /dev/null 2>&1 && sleep 0' 30529 1726882588.21022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.21025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882588.21027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882588.21036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.21038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 
1726882588.21040: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.21042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882588.21045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.21086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882588.21089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882588.21091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.21138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.22924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882588.22948: stderr chunk (state=3): >>><<< 30529 1726882588.22951: stdout chunk (state=3): >>><<< 30529 1726882588.22968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882588.22974: handler run complete 30529 1726882588.22991: attempt loop complete, returning result 30529 1726882588.22995: _execute() done 30529 1726882588.22998: dumping result to json 30529 1726882588.23000: done dumping result, returning 30529 1726882588.23005: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [12673a56-9f93-b0f1-edc0-00000000002e] 30529 1726882588.23009: sending task result for task 12673a56-9f93-b0f1-edc0-00000000002e 30529 1726882588.23095: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000002e 30529 1726882588.23098: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882588.23204: no more pending results, returning what we have 30529 1726882588.23213: results queue empty 30529 1726882588.23214: checking for any_errors_fatal 30529 1726882588.23217: done checking for any_errors_fatal 30529 1726882588.23218: checking for max_fail_percentage 30529 1726882588.23219: done checking for max_fail_percentage 30529 1726882588.23220: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.23221: done checking to see if all hosts have failed 30529 1726882588.23222: getting the remaining hosts for this loop 30529 1726882588.23223: done getting the remaining hosts for this loop 30529 1726882588.23227: getting the next task for host managed_node1 30529 1726882588.23233: done getting next task for host managed_node1 30529 1726882588.23235: ^ task is: 
TASK: Set flag to indicate system is ostree 30529 1726882588.23237: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882588.23240: getting variables 30529 1726882588.23241: in VariableManager get_vars() 30529 1726882588.23266: Calling all_inventory to load vars for managed_node1 30529 1726882588.23268: Calling groups_inventory to load vars for managed_node1 30529 1726882588.23271: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.23283: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.23288: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.23291: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.23422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.23537: done with get_vars() 30529 1726882588.23544: done getting variables 30529 1726882588.23619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 
September 2024 21:36:28 -0400 (0:00:00.714) 0:00:02.262 ****** 30529 1726882588.23640: entering _queue_task() for managed_node1/set_fact 30529 1726882588.23641: Creating lock for set_fact 30529 1726882588.23839: worker is 1 (out of 1 available) 30529 1726882588.23851: exiting _queue_task() for managed_node1/set_fact 30529 1726882588.23864: done queuing things up, now waiting for results queue to drain 30529 1726882588.23865: waiting for pending results... 30529 1726882588.24001: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 30529 1726882588.24059: in run() - task 12673a56-9f93-b0f1-edc0-00000000002f 30529 1726882588.24076: variable 'ansible_search_path' from source: unknown 30529 1726882588.24079: variable 'ansible_search_path' from source: unknown 30529 1726882588.24112: calling self._execute() 30529 1726882588.24162: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.24165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.24173: variable 'omit' from source: magic vars 30529 1726882588.24506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882588.24697: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882588.24728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882588.24754: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882588.24780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882588.24845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882588.24865: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882588.24883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882588.24906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882588.24997: Evaluated conditional (not __network_is_ostree is defined): True 30529 1726882588.25001: variable 'omit' from source: magic vars 30529 1726882588.25024: variable 'omit' from source: magic vars 30529 1726882588.25105: variable '__ostree_booted_stat' from source: set_fact 30529 1726882588.25140: variable 'omit' from source: magic vars 30529 1726882588.25157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882588.25179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882588.25197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882588.25210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.25220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.25241: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882588.25244: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.25247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.25318: Set connection var 
ansible_shell_executable to /bin/sh 30529 1726882588.25321: Set connection var ansible_pipelining to False 30529 1726882588.25324: Set connection var ansible_shell_type to sh 30529 1726882588.25331: Set connection var ansible_timeout to 10 30529 1726882588.25334: Set connection var ansible_connection to ssh 30529 1726882588.25338: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882588.25354: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.25357: variable 'ansible_connection' from source: unknown 30529 1726882588.25360: variable 'ansible_module_compression' from source: unknown 30529 1726882588.25362: variable 'ansible_shell_type' from source: unknown 30529 1726882588.25364: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.25366: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.25369: variable 'ansible_pipelining' from source: unknown 30529 1726882588.25372: variable 'ansible_timeout' from source: unknown 30529 1726882588.25376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.25446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882588.25454: variable 'omit' from source: magic vars 30529 1726882588.25458: starting attempt loop 30529 1726882588.25461: running the handler 30529 1726882588.25470: handler run complete 30529 1726882588.25477: attempt loop complete, returning result 30529 1726882588.25480: _execute() done 30529 1726882588.25482: dumping result to json 30529 1726882588.25485: done dumping result, returning 30529 1726882588.25494: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 
[12673a56-9f93-b0f1-edc0-00000000002f] 30529 1726882588.25499: sending task result for task 12673a56-9f93-b0f1-edc0-00000000002f 30529 1726882588.25572: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000002f 30529 1726882588.25574: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 30529 1726882588.25656: no more pending results, returning what we have 30529 1726882588.25658: results queue empty 30529 1726882588.25659: checking for any_errors_fatal 30529 1726882588.25663: done checking for any_errors_fatal 30529 1726882588.25663: checking for max_fail_percentage 30529 1726882588.25665: done checking for max_fail_percentage 30529 1726882588.25665: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.25666: done checking to see if all hosts have failed 30529 1726882588.25667: getting the remaining hosts for this loop 30529 1726882588.25668: done getting the remaining hosts for this loop 30529 1726882588.25671: getting the next task for host managed_node1 30529 1726882588.25677: done getting next task for host managed_node1 30529 1726882588.25680: ^ task is: TASK: Fix CentOS6 Base repo 30529 1726882588.25682: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.25685: getting variables 30529 1726882588.25686: in VariableManager get_vars() 30529 1726882588.25710: Calling all_inventory to load vars for managed_node1 30529 1726882588.25713: Calling groups_inventory to load vars for managed_node1 30529 1726882588.25715: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.25723: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.25726: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.25734: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.25831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.25961: done with get_vars() 30529 1726882588.25967: done getting variables 30529 1726882588.26048: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:36:28 -0400 (0:00:00.024) 0:00:02.286 ****** 30529 1726882588.26068: entering _queue_task() for managed_node1/copy 30529 1726882588.26241: worker is 1 (out of 1 available) 30529 1726882588.26253: exiting _queue_task() for managed_node1/copy 30529 1726882588.26264: done queuing things up, now waiting for results queue to drain 30529 1726882588.26265: waiting for pending results... 
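The ok result above, the evaluated conditional `(not __network_is_ostree is defined)`, and the `__ostree_booted_stat` variable pulled from `set_fact` are enough to sketch the task at `el_repo_setup.yml:22` that this log records. This is a hedged reconstruction: the conditional and variable names are taken verbatim from the log, but the exact expression computing the fact value is an assumption, since the log only shows the final result (`__network_is_ostree: false`).

```yaml
# Hedged sketch of "Set flag to indicate system is ostree"
# (el_repo_setup.yml:22). The when-condition and variable names appear
# verbatim in the log; the .stat.exists lookup is an assumption.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

On this host the stat evidently found no ostree marker, so the fact is set to `false`, which later gates the EPEL include.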
30529 1726882588.26392: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 30529 1726882588.26448: in run() - task 12673a56-9f93-b0f1-edc0-000000000031 30529 1726882588.26459: variable 'ansible_search_path' from source: unknown 30529 1726882588.26462: variable 'ansible_search_path' from source: unknown 30529 1726882588.26486: calling self._execute() 30529 1726882588.26539: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.26542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.26550: variable 'omit' from source: magic vars 30529 1726882588.26853: variable 'ansible_distribution' from source: facts 30529 1726882588.26868: Evaluated conditional (ansible_distribution == 'CentOS'): True 30529 1726882588.26948: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.26951: Evaluated conditional (ansible_distribution_major_version == '6'): False 30529 1726882588.26954: when evaluation is False, skipping this task 30529 1726882588.26957: _execute() done 30529 1726882588.26959: dumping result to json 30529 1726882588.26961: done dumping result, returning 30529 1726882588.26966: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [12673a56-9f93-b0f1-edc0-000000000031] 30529 1726882588.26971: sending task result for task 12673a56-9f93-b0f1-edc0-000000000031 30529 1726882588.27055: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000031 30529 1726882588.27058: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30529 1726882588.27108: no more pending results, returning what we have 30529 1726882588.27111: results queue empty 30529 1726882588.27111: checking for any_errors_fatal 30529 1726882588.27115: done checking for any_errors_fatal 30529 1726882588.27115: checking for 
max_fail_percentage 30529 1726882588.27117: done checking for max_fail_percentage 30529 1726882588.27117: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.27118: done checking to see if all hosts have failed 30529 1726882588.27119: getting the remaining hosts for this loop 30529 1726882588.27120: done getting the remaining hosts for this loop 30529 1726882588.27123: getting the next task for host managed_node1 30529 1726882588.27127: done getting next task for host managed_node1 30529 1726882588.27130: ^ task is: TASK: Include the task 'enable_epel.yml' 30529 1726882588.27132: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.27135: getting variables 30529 1726882588.27136: in VariableManager get_vars() 30529 1726882588.27156: Calling all_inventory to load vars for managed_node1 30529 1726882588.27158: Calling groups_inventory to load vars for managed_node1 30529 1726882588.27161: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.27170: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.27172: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.27174: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.27275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.27386: done with get_vars() 30529 1726882588.27392: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:36:28 -0400 (0:00:00.013) 0:00:02.300 ****** 30529 1726882588.27451: entering _queue_task() for managed_node1/include_tasks 30529 1726882588.27611: worker is 1 (out of 1 available) 30529 1726882588.27624: exiting _queue_task() for managed_node1/include_tasks 30529 1726882588.27635: done queuing things up, now waiting for results queue to drain 30529 1726882588.27637: waiting for pending results... 
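The skip just logged for "Fix CentOS6 Base repo" follows from two conditionals the log evaluates in order: `ansible_distribution == 'CentOS'` (True) and `ansible_distribution_major_version == '6'` (False). Since the action loaded was `copy`, the task at `el_repo_setup.yml:26` can be sketched roughly as below; the copy payload and destination are not shown in the log and are placeholders only.

```yaml
# Hedged sketch of the skipped "Fix CentOS6 Base repo" task
# (el_repo_setup.yml:26). Both when-conditions are verbatim from the
# log; content and dest are hypothetical, as the log does not show them.
- name: Fix CentOS6 Base repo
  copy:
    content: "..."                              # not shown in the log
    dest: /etc/yum.repos.d/CentOS-Base.repo     # assumed destination
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Because `when` is a list, all conditions must hold; the second one fails on this major version 10 host, producing the `false_condition` recorded in the skip result.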
30529 1726882588.27753: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 30529 1726882588.27809: in run() - task 12673a56-9f93-b0f1-edc0-000000000032 30529 1726882588.27818: variable 'ansible_search_path' from source: unknown 30529 1726882588.27821: variable 'ansible_search_path' from source: unknown 30529 1726882588.27844: calling self._execute() 30529 1726882588.27897: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.27900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.27908: variable 'omit' from source: magic vars 30529 1726882588.28249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882588.29715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882588.29766: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882588.29795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882588.29821: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882588.29841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882588.29897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882588.29917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882588.29934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882588.29963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882588.29974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882588.30053: variable '__network_is_ostree' from source: set_fact 30529 1726882588.30062: Evaluated conditional (not __network_is_ostree | d(false)): True 30529 1726882588.30067: _execute() done 30529 1726882588.30070: dumping result to json 30529 1726882588.30072: done dumping result, returning 30529 1726882588.30078: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-b0f1-edc0-000000000032] 30529 1726882588.30081: sending task result for task 12673a56-9f93-b0f1-edc0-000000000032 30529 1726882588.30157: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000032 30529 1726882588.30160: WORKER PROCESS EXITING 30529 1726882588.30183: no more pending results, returning what we have 30529 1726882588.30187: in VariableManager get_vars() 30529 1726882588.30218: Calling all_inventory to load vars for managed_node1 30529 1726882588.30220: Calling groups_inventory to load vars for managed_node1 30529 1726882588.30223: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.30232: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.30235: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.30238: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.30379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 30529 1726882588.30486: done with get_vars() 30529 1726882588.30491: variable 'ansible_search_path' from source: unknown 30529 1726882588.30492: variable 'ansible_search_path' from source: unknown 30529 1726882588.30516: we have included files to process 30529 1726882588.30517: generating all_blocks data 30529 1726882588.30519: done generating all_blocks data 30529 1726882588.30522: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30529 1726882588.30523: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30529 1726882588.30525: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30529 1726882588.30963: done processing included file 30529 1726882588.30965: iterating over new_blocks loaded from include file 30529 1726882588.30965: in VariableManager get_vars() 30529 1726882588.30973: done with get_vars() 30529 1726882588.30974: filtering new block on tags 30529 1726882588.30987: done filtering new block on tags 30529 1726882588.30989: in VariableManager get_vars() 30529 1726882588.30997: done with get_vars() 30529 1726882588.30998: filtering new block on tags 30529 1726882588.31005: done filtering new block on tags 30529 1726882588.31007: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 30529 1726882588.31010: extending task lists for all hosts with included blocks 30529 1726882588.31066: done extending task lists 30529 1726882588.31067: done processing included files 30529 1726882588.31067: results queue empty 30529 1726882588.31068: checking for any_errors_fatal 30529 1726882588.31069: done checking for any_errors_fatal 30529 1726882588.31070: checking for max_fail_percentage 30529 1726882588.31070: done 
checking for max_fail_percentage 30529 1726882588.31071: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.31071: done checking to see if all hosts have failed 30529 1726882588.31072: getting the remaining hosts for this loop 30529 1726882588.31073: done getting the remaining hosts for this loop 30529 1726882588.31074: getting the next task for host managed_node1 30529 1726882588.31077: done getting next task for host managed_node1 30529 1726882588.31078: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 30529 1726882588.31080: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.31081: getting variables 30529 1726882588.31082: in VariableManager get_vars() 30529 1726882588.31088: Calling all_inventory to load vars for managed_node1 30529 1726882588.31089: Calling groups_inventory to load vars for managed_node1 30529 1726882588.31090: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.31095: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.31100: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.31102: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.31268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.31376: done with get_vars() 30529 1726882588.31382: done getting variables 30529 1726882588.31428: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 30529 1726882588.31544: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:36:28 -0400 (0:00:00.041) 0:00:02.341 ****** 30529 1726882588.31571: entering _queue_task() for managed_node1/command 30529 1726882588.31572: Creating lock for command 30529 1726882588.31743: worker is 1 (out of 1 available) 30529 1726882588.31755: exiting _queue_task() for managed_node1/command 30529 1726882588.31767: done queuing things up, now waiting for results queue to drain 30529 1726882588.31768: waiting for pending results... 
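The include step just logged (conditional `not __network_is_ostree | d(false)` evaluated True, then block extension for `enable_epel.yml`) and the banner rendering "Create EPEL {{ ansible_distribution_major_version }}" as "Create EPEL 10" can be sketched as follows. The conditional is verbatim from the log; the relative include path is an assumption based on the resolved path under `tests/network/tasks/`.

```yaml
# Hedged sketch of the include at el_repo_setup.yml:51. The when-
# condition is verbatim from the log; the relative path is assumed.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Note that task names in the included file are Jinja-templated per host: the banner "Create EPEL 10" shows `ansible_distribution_major_version` resolved to `10` before the task name was printed.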
30529 1726882588.31896: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 30529 1726882588.31963: in run() - task 12673a56-9f93-b0f1-edc0-00000000004c 30529 1726882588.31973: variable 'ansible_search_path' from source: unknown 30529 1726882588.31976: variable 'ansible_search_path' from source: unknown 30529 1726882588.32006: calling self._execute() 30529 1726882588.32054: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.32057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.32066: variable 'omit' from source: magic vars 30529 1726882588.32304: variable 'ansible_distribution' from source: facts 30529 1726882588.32312: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30529 1726882588.32396: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.32399: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30529 1726882588.32403: when evaluation is False, skipping this task 30529 1726882588.32406: _execute() done 30529 1726882588.32408: dumping result to json 30529 1726882588.32414: done dumping result, returning 30529 1726882588.32420: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [12673a56-9f93-b0f1-edc0-00000000004c] 30529 1726882588.32425: sending task result for task 12673a56-9f93-b0f1-edc0-00000000004c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30529 1726882588.32556: no more pending results, returning what we have 30529 1726882588.32558: results queue empty 30529 1726882588.32559: checking for any_errors_fatal 30529 1726882588.32560: done checking for any_errors_fatal 30529 1726882588.32561: checking for max_fail_percentage 30529 1726882588.32562: done checking for max_fail_percentage 30529 1726882588.32562: checking to see if all hosts have failed 
and the running result is not ok 30529 1726882588.32563: done checking to see if all hosts have failed 30529 1726882588.32564: getting the remaining hosts for this loop 30529 1726882588.32565: done getting the remaining hosts for this loop 30529 1726882588.32568: getting the next task for host managed_node1 30529 1726882588.32572: done getting next task for host managed_node1 30529 1726882588.32575: ^ task is: TASK: Install yum-utils package 30529 1726882588.32577: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.32580: getting variables 30529 1726882588.32581: in VariableManager get_vars() 30529 1726882588.32605: Calling all_inventory to load vars for managed_node1 30529 1726882588.32607: Calling groups_inventory to load vars for managed_node1 30529 1726882588.32610: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.32618: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.32621: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.32632: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.32780: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000004c 30529 1726882588.32784: WORKER PROCESS EXITING 30529 1726882588.32796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.32906: done with get_vars() 30529 1726882588.32911: done getting variables 30529 1726882588.32972: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:36:28 -0400 (0:00:00.014) 0:00:02.356 ****** 30529 1726882588.32990: entering _queue_task() for managed_node1/package 30529 1726882588.32991: Creating lock for package 30529 1726882588.33153: worker is 1 (out of 1 available) 30529 1726882588.33166: exiting _queue_task() for managed_node1/package 30529 1726882588.33176: done queuing things up, now waiting for results queue to drain 30529 1726882588.33178: waiting for pending results... 
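The "Create EPEL 10" skip follows the same two-stage evaluation as before: `ansible_distribution in ['RedHat', 'CentOS']` is True, `ansible_distribution_major_version in ['7', '8']` is False. The action loaded was `command`, so the task at `enable_epel.yml:8` looks roughly like the sketch below; the actual command line never appears in the log, so it is left as an explicitly hypothetical placeholder.

```yaml
# Hedged sketch of the skipped "Create EPEL {{ ansible_distribution_major_version }}"
# task (enable_epel.yml:8). Conditions are verbatim from the log; the
# command itself is a hypothetical placeholder, not shown in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "{{ __create_epel_cmd }}"   # hypothetical; real command not logged
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```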
30529 1726882588.33296: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 30529 1726882588.33358: in run() - task 12673a56-9f93-b0f1-edc0-00000000004d 30529 1726882588.33368: variable 'ansible_search_path' from source: unknown 30529 1726882588.33371: variable 'ansible_search_path' from source: unknown 30529 1726882588.33397: calling self._execute() 30529 1726882588.33441: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.33444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.33453: variable 'omit' from source: magic vars 30529 1726882588.33679: variable 'ansible_distribution' from source: facts 30529 1726882588.33688: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30529 1726882588.33772: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.33776: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30529 1726882588.33779: when evaluation is False, skipping this task 30529 1726882588.33781: _execute() done 30529 1726882588.33784: dumping result to json 30529 1726882588.33786: done dumping result, returning 30529 1726882588.33797: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [12673a56-9f93-b0f1-edc0-00000000004d] 30529 1726882588.33800: sending task result for task 12673a56-9f93-b0f1-edc0-00000000004d 30529 1726882588.33875: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000004d 30529 1726882588.33878: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30529 1726882588.33920: no more pending results, returning what we have 30529 1726882588.33922: results queue empty 30529 1726882588.33923: checking for any_errors_fatal 30529 1726882588.33927: done checking for any_errors_fatal 30529 
1726882588.33927: checking for max_fail_percentage 30529 1726882588.33928: done checking for max_fail_percentage 30529 1726882588.33929: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.33930: done checking to see if all hosts have failed 30529 1726882588.33930: getting the remaining hosts for this loop 30529 1726882588.33932: done getting the remaining hosts for this loop 30529 1726882588.33934: getting the next task for host managed_node1 30529 1726882588.33939: done getting next task for host managed_node1 30529 1726882588.33941: ^ task is: TASK: Enable EPEL 7 30529 1726882588.33944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.33946: getting variables 30529 1726882588.33948: in VariableManager get_vars() 30529 1726882588.33968: Calling all_inventory to load vars for managed_node1 30529 1726882588.33969: Calling groups_inventory to load vars for managed_node1 30529 1726882588.33972: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.33979: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.33982: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.33985: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.34079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.34186: done with get_vars() 30529 1726882588.34194: done getting variables 30529 1726882588.34230: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:36:28 -0400 (0:00:00.012) 0:00:02.368 ****** 30529 1726882588.34246: entering _queue_task() for managed_node1/command 30529 1726882588.34398: worker is 1 (out of 1 available) 30529 1726882588.34410: exiting _queue_task() for managed_node1/command 30529 1726882588.34422: done queuing things up, now waiting for results queue to drain 30529 1726882588.34423: waiting for pending results... 
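The "Install yum-utils package" skip logged above used the `package` action (per the ActionModule load for `enable_epel.yml:26`) with the same distribution/version guard. A hedged sketch, assuming the package name from the task title and a default `state`:

```yaml
# Hedged sketch of the skipped "Install yum-utils package" task
# (enable_epel.yml:26). The 'package' module and when-conditions come
# from the log; the package name is inferred from the task title and
# state: present is assumed.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

All three EPEL-related tasks in this included file share the `['7', '8']` version guard, which is why each one is skipped on this major version 10 host.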
30529 1726882588.34595: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 30529 1726882588.34689: in run() - task 12673a56-9f93-b0f1-edc0-00000000004e 30529 1726882588.34692: variable 'ansible_search_path' from source: unknown 30529 1726882588.34697: variable 'ansible_search_path' from source: unknown 30529 1726882588.34714: calling self._execute() 30529 1726882588.34777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.34797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.34813: variable 'omit' from source: magic vars 30529 1726882588.35233: variable 'ansible_distribution' from source: facts 30529 1726882588.35398: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30529 1726882588.35401: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.35403: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30529 1726882588.35405: when evaluation is False, skipping this task 30529 1726882588.35408: _execute() done 30529 1726882588.35410: dumping result to json 30529 1726882588.35411: done dumping result, returning 30529 1726882588.35413: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [12673a56-9f93-b0f1-edc0-00000000004e] 30529 1726882588.35415: sending task result for task 12673a56-9f93-b0f1-edc0-00000000004e skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30529 1726882588.35568: no more pending results, returning what we have 30529 1726882588.35572: results queue empty 30529 1726882588.35572: checking for any_errors_fatal 30529 1726882588.35578: done checking for any_errors_fatal 30529 1726882588.35579: checking for max_fail_percentage 30529 1726882588.35581: done checking for max_fail_percentage 30529 1726882588.35581: checking to see if all hosts have failed and 
the running result is not ok 30529 1726882588.35582: done checking to see if all hosts have failed 30529 1726882588.35583: getting the remaining hosts for this loop 30529 1726882588.35585: done getting the remaining hosts for this loop 30529 1726882588.35589: getting the next task for host managed_node1 30529 1726882588.35597: done getting next task for host managed_node1 30529 1726882588.35599: ^ task is: TASK: Enable EPEL 8 30529 1726882588.35603: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.35607: getting variables 30529 1726882588.35610: in VariableManager get_vars() 30529 1726882588.35638: Calling all_inventory to load vars for managed_node1 30529 1726882588.35641: Calling groups_inventory to load vars for managed_node1 30529 1726882588.35645: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.35658: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.35661: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.35664: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.35933: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000004e 30529 1726882588.35938: WORKER PROCESS EXITING 30529 1726882588.35955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.36069: done with get_vars() 30529 1726882588.36075: done getting variables 30529 1726882588.36117: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:36:28 -0400 (0:00:00.018) 0:00:02.387 ****** 30529 1726882588.36137: entering _queue_task() for managed_node1/command 30529 1726882588.36304: worker is 1 (out of 1 available) 30529 1726882588.36315: exiting _queue_task() for managed_node1/command 30529 1726882588.36325: done queuing things up, now waiting for results queue to drain 30529 1726882588.36327: waiting for pending results... 
30529 1726882588.36463: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 30529 1726882588.36526: in run() - task 12673a56-9f93-b0f1-edc0-00000000004f 30529 1726882588.36535: variable 'ansible_search_path' from source: unknown 30529 1726882588.36538: variable 'ansible_search_path' from source: unknown 30529 1726882588.36563: calling self._execute() 30529 1726882588.36613: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.36616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.36624: variable 'omit' from source: magic vars 30529 1726882588.36861: variable 'ansible_distribution' from source: facts 30529 1726882588.36873: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30529 1726882588.36958: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.36962: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30529 1726882588.36964: when evaluation is False, skipping this task 30529 1726882588.36967: _execute() done 30529 1726882588.36970: dumping result to json 30529 1726882588.36975: done dumping result, returning 30529 1726882588.36981: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [12673a56-9f93-b0f1-edc0-00000000004f] 30529 1726882588.36985: sending task result for task 12673a56-9f93-b0f1-edc0-00000000004f 30529 1726882588.37064: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000004f 30529 1726882588.37067: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30529 1726882588.37129: no more pending results, returning what we have 30529 1726882588.37132: results queue empty 30529 1726882588.37132: checking for any_errors_fatal 30529 1726882588.37135: done checking for any_errors_fatal 30529 1726882588.37136: checking for 
max_fail_percentage 30529 1726882588.37137: done checking for max_fail_percentage 30529 1726882588.37138: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.37139: done checking to see if all hosts have failed 30529 1726882588.37139: getting the remaining hosts for this loop 30529 1726882588.37141: done getting the remaining hosts for this loop 30529 1726882588.37143: getting the next task for host managed_node1 30529 1726882588.37150: done getting next task for host managed_node1 30529 1726882588.37152: ^ task is: TASK: Enable EPEL 6 30529 1726882588.37155: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.37158: getting variables 30529 1726882588.37159: in VariableManager get_vars() 30529 1726882588.37179: Calling all_inventory to load vars for managed_node1 30529 1726882588.37181: Calling groups_inventory to load vars for managed_node1 30529 1726882588.37184: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.37191: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.37192: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.37196: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.37289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.37400: done with get_vars() 30529 1726882588.37407: done getting variables 30529 1726882588.37441: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:36:28 -0400 (0:00:00.013) 0:00:02.400 ****** 30529 1726882588.37458: entering _queue_task() for managed_node1/copy 30529 1726882588.37609: worker is 1 (out of 1 available) 30529 1726882588.37621: exiting _queue_task() for managed_node1/copy 30529 1726882588.37632: done queuing things up, now waiting for results queue to drain 30529 1726882588.37634: waiting for pending results... 
30529 1726882588.37759: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 30529 1726882588.37900: in run() - task 12673a56-9f93-b0f1-edc0-000000000051 30529 1726882588.37903: variable 'ansible_search_path' from source: unknown 30529 1726882588.37906: variable 'ansible_search_path' from source: unknown 30529 1726882588.37909: calling self._execute() 30529 1726882588.38022: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.38026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.38028: variable 'omit' from source: magic vars 30529 1726882588.38787: variable 'ansible_distribution' from source: facts 30529 1726882588.38954: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30529 1726882588.38997: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.39003: Evaluated conditional (ansible_distribution_major_version == '6'): False 30529 1726882588.39006: when evaluation is False, skipping this task 30529 1726882588.39009: _execute() done 30529 1726882588.39011: dumping result to json 30529 1726882588.39014: done dumping result, returning 30529 1726882588.39021: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [12673a56-9f93-b0f1-edc0-000000000051] 30529 1726882588.39026: sending task result for task 12673a56-9f93-b0f1-edc0-000000000051 30529 1726882588.39121: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000051 30529 1726882588.39125: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30529 1726882588.39210: no more pending results, returning what we have 30529 1726882588.39214: results queue empty 30529 1726882588.39215: checking for any_errors_fatal 30529 1726882588.39221: done checking for any_errors_fatal 30529 1726882588.39222: checking for max_fail_percentage 
30529 1726882588.39224: done checking for max_fail_percentage 30529 1726882588.39224: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.39226: done checking to see if all hosts have failed 30529 1726882588.39227: getting the remaining hosts for this loop 30529 1726882588.39228: done getting the remaining hosts for this loop 30529 1726882588.39232: getting the next task for host managed_node1 30529 1726882588.39242: done getting next task for host managed_node1 30529 1726882588.39245: ^ task is: TASK: Set network provider to 'nm' 30529 1726882588.39247: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882588.39251: getting variables 30529 1726882588.39253: in VariableManager get_vars() 30529 1726882588.39279: Calling all_inventory to load vars for managed_node1 30529 1726882588.39282: Calling groups_inventory to load vars for managed_node1 30529 1726882588.39286: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.39299: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.39302: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.39306: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.39729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.40015: done with get_vars() 30529 1726882588.40023: done getting variables 30529 1726882588.40073: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 21:36:28 -0400 (0:00:00.026) 0:00:02.427 ****** 30529 1726882588.40119: entering _queue_task() for managed_node1/set_fact 30529 1726882588.40344: worker is 1 (out of 1 available) 30529 1726882588.40357: exiting _queue_task() for managed_node1/set_fact 30529 1726882588.40367: done queuing things up, now waiting for results queue to drain 30529 1726882588.40368: waiting for pending results... 30529 1726882588.40719: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 30529 1726882588.40724: in run() - task 12673a56-9f93-b0f1-edc0-000000000007 30529 1726882588.40727: variable 'ansible_search_path' from source: unknown 30529 1726882588.40730: calling self._execute() 30529 1726882588.40757: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.40762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.40771: variable 'omit' from source: magic vars 30529 1726882588.40868: variable 'omit' from source: magic vars 30529 1726882588.40899: variable 'omit' from source: magic vars 30529 1726882588.40935: variable 'omit' from source: magic vars 30529 1726882588.40976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882588.41021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882588.41141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882588.41144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.41146: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.41148: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882588.41150: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.41152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.41233: Set connection var ansible_shell_executable to /bin/sh 30529 1726882588.41245: Set connection var ansible_pipelining to False 30529 1726882588.41254: Set connection var ansible_shell_type to sh 30529 1726882588.41267: Set connection var ansible_timeout to 10 30529 1726882588.41272: Set connection var ansible_connection to ssh 30529 1726882588.41283: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882588.41359: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.41362: variable 'ansible_connection' from source: unknown 30529 1726882588.41364: variable 'ansible_module_compression' from source: unknown 30529 1726882588.41366: variable 'ansible_shell_type' from source: unknown 30529 1726882588.41368: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.41370: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.41372: variable 'ansible_pipelining' from source: unknown 30529 1726882588.41374: variable 'ansible_timeout' from source: unknown 30529 1726882588.41376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.41483: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882588.41503: variable 'omit' from source: magic vars 30529 1726882588.41512: starting 
attempt loop 30529 1726882588.41518: running the handler 30529 1726882588.41530: handler run complete 30529 1726882588.41543: attempt loop complete, returning result 30529 1726882588.41575: _execute() done 30529 1726882588.41578: dumping result to json 30529 1726882588.41580: done dumping result, returning 30529 1726882588.41582: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [12673a56-9f93-b0f1-edc0-000000000007] 30529 1726882588.41584: sending task result for task 12673a56-9f93-b0f1-edc0-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 30529 1726882588.41729: no more pending results, returning what we have 30529 1726882588.41732: results queue empty 30529 1726882588.41733: checking for any_errors_fatal 30529 1726882588.41739: done checking for any_errors_fatal 30529 1726882588.41740: checking for max_fail_percentage 30529 1726882588.41742: done checking for max_fail_percentage 30529 1726882588.41742: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.41743: done checking to see if all hosts have failed 30529 1726882588.41744: getting the remaining hosts for this loop 30529 1726882588.41745: done getting the remaining hosts for this loop 30529 1726882588.41749: getting the next task for host managed_node1 30529 1726882588.41755: done getting next task for host managed_node1 30529 1726882588.41757: ^ task is: TASK: meta (flush_handlers) 30529 1726882588.41758: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.41762: getting variables 30529 1726882588.41764: in VariableManager get_vars() 30529 1726882588.41791: Calling all_inventory to load vars for managed_node1 30529 1726882588.41795: Calling groups_inventory to load vars for managed_node1 30529 1726882588.41799: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.41809: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.41813: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.41815: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.42370: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000007 30529 1726882588.42374: WORKER PROCESS EXITING 30529 1726882588.42398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.42783: done with get_vars() 30529 1726882588.42795: done getting variables 30529 1726882588.42854: in VariableManager get_vars() 30529 1726882588.42862: Calling all_inventory to load vars for managed_node1 30529 1726882588.42864: Calling groups_inventory to load vars for managed_node1 30529 1726882588.42866: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.42870: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.42872: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.42874: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.43231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.43449: done with get_vars() 30529 1726882588.43463: done queuing things up, now waiting for results queue to drain 30529 1726882588.43465: results queue empty 30529 1726882588.43466: checking for any_errors_fatal 30529 1726882588.43468: done checking for any_errors_fatal 30529 1726882588.43469: checking for max_fail_percentage 30529 
1726882588.43470: done checking for max_fail_percentage 30529 1726882588.43471: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.43471: done checking to see if all hosts have failed 30529 1726882588.43472: getting the remaining hosts for this loop 30529 1726882588.43473: done getting the remaining hosts for this loop 30529 1726882588.43475: getting the next task for host managed_node1 30529 1726882588.43479: done getting next task for host managed_node1 30529 1726882588.43481: ^ task is: TASK: meta (flush_handlers) 30529 1726882588.43482: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882588.43492: getting variables 30529 1726882588.43495: in VariableManager get_vars() 30529 1726882588.43503: Calling all_inventory to load vars for managed_node1 30529 1726882588.43505: Calling groups_inventory to load vars for managed_node1 30529 1726882588.43507: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.43511: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.43514: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.43516: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.43649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.43829: done with get_vars() 30529 1726882588.43836: done getting variables 30529 1726882588.43875: in VariableManager get_vars() 30529 1726882588.43882: Calling all_inventory to load vars for managed_node1 30529 1726882588.43884: Calling groups_inventory to load vars for managed_node1 30529 1726882588.43889: Calling all_plugins_inventory to load vars for managed_node1 30529 
1726882588.43894: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.43897: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.43900: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.44051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.44259: done with get_vars() 30529 1726882588.44269: done queuing things up, now waiting for results queue to drain 30529 1726882588.44271: results queue empty 30529 1726882588.44272: checking for any_errors_fatal 30529 1726882588.44273: done checking for any_errors_fatal 30529 1726882588.44273: checking for max_fail_percentage 30529 1726882588.44274: done checking for max_fail_percentage 30529 1726882588.44275: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.44276: done checking to see if all hosts have failed 30529 1726882588.44276: getting the remaining hosts for this loop 30529 1726882588.44279: done getting the remaining hosts for this loop 30529 1726882588.44281: getting the next task for host managed_node1 30529 1726882588.44284: done getting next task for host managed_node1 30529 1726882588.44285: ^ task is: None 30529 1726882588.44288: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.44289: done queuing things up, now waiting for results queue to drain 30529 1726882588.44290: results queue empty 30529 1726882588.44290: checking for any_errors_fatal 30529 1726882588.44291: done checking for any_errors_fatal 30529 1726882588.44292: checking for max_fail_percentage 30529 1726882588.44295: done checking for max_fail_percentage 30529 1726882588.44296: checking to see if all hosts have failed and the running result is not ok 30529 1726882588.44297: done checking to see if all hosts have failed 30529 1726882588.44299: getting the next task for host managed_node1 30529 1726882588.44301: done getting next task for host managed_node1 30529 1726882588.44302: ^ task is: None 30529 1726882588.44303: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.44346: in VariableManager get_vars() 30529 1726882588.44360: done with get_vars() 30529 1726882588.44366: in VariableManager get_vars() 30529 1726882588.44375: done with get_vars() 30529 1726882588.44379: variable 'omit' from source: magic vars 30529 1726882588.44426: in VariableManager get_vars() 30529 1726882588.44436: done with get_vars() 30529 1726882588.44454: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 30529 1726882588.44764: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 30529 1726882588.44791: getting the remaining hosts for this loop 30529 1726882588.44792: done getting the remaining hosts for this loop 30529 1726882588.44797: getting the next task for host managed_node1 30529 1726882588.44799: done getting next task for host managed_node1 30529 1726882588.44801: ^ task is: TASK: Gathering Facts 30529 1726882588.44802: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882588.44804: getting variables 30529 1726882588.44805: in VariableManager get_vars() 30529 1726882588.44812: Calling all_inventory to load vars for managed_node1 30529 1726882588.44815: Calling groups_inventory to load vars for managed_node1 30529 1726882588.44817: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882588.44821: Calling all_plugins_play to load vars for managed_node1 30529 1726882588.44833: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882588.44836: Calling groups_plugins_play to load vars for managed_node1 30529 1726882588.44971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882588.45181: done with get_vars() 30529 1726882588.45192: done getting variables 30529 1726882588.45233: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 21:36:28 -0400 (0:00:00.051) 0:00:02.478 ****** 30529 1726882588.45268: entering _queue_task() for managed_node1/gather_facts 30529 1726882588.45474: worker is 1 (out of 1 available) 30529 1726882588.45483: exiting _queue_task() for managed_node1/gather_facts 30529 1726882588.45500: done queuing things up, now waiting for results queue to drain 30529 1726882588.45502: waiting for pending results... 
30529 1726882588.45743: running TaskExecutor() for managed_node1/TASK: Gathering Facts 30529 1726882588.45888: in run() - task 12673a56-9f93-b0f1-edc0-000000000077 30529 1726882588.45892: variable 'ansible_search_path' from source: unknown 30529 1726882588.45899: calling self._execute() 30529 1726882588.45972: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.45982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.46002: variable 'omit' from source: magic vars 30529 1726882588.46498: variable 'ansible_distribution_major_version' from source: facts 30529 1726882588.46502: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882588.46504: variable 'omit' from source: magic vars 30529 1726882588.46506: variable 'omit' from source: magic vars 30529 1726882588.46509: variable 'omit' from source: magic vars 30529 1726882588.46511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882588.46528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882588.46552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882588.46573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.46592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882588.46628: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882588.46637: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.46644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.46751: Set connection var ansible_shell_executable to /bin/sh 30529 1726882588.46763: Set 
connection var ansible_pipelining to False 30529 1726882588.46771: Set connection var ansible_shell_type to sh 30529 1726882588.46788: Set connection var ansible_timeout to 10 30529 1726882588.46798: Set connection var ansible_connection to ssh 30529 1726882588.46808: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882588.46832: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.46843: variable 'ansible_connection' from source: unknown 30529 1726882588.46850: variable 'ansible_module_compression' from source: unknown 30529 1726882588.46856: variable 'ansible_shell_type' from source: unknown 30529 1726882588.46861: variable 'ansible_shell_executable' from source: unknown 30529 1726882588.46867: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882588.46950: variable 'ansible_pipelining' from source: unknown 30529 1726882588.46954: variable 'ansible_timeout' from source: unknown 30529 1726882588.46956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882588.47064: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882588.47078: variable 'omit' from source: magic vars 30529 1726882588.47090: starting attempt loop 30529 1726882588.47099: running the handler 30529 1726882588.47117: variable 'ansible_facts' from source: unknown 30529 1726882588.47140: _low_level_execute_command(): starting 30529 1726882588.47154: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882588.47877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882588.47896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 
1726882588.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.47949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882588.48044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882588.48065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.48152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.49830: stdout chunk (state=3): >>>/root <<< 30529 1726882588.49988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882588.49991: stdout chunk (state=3): >>><<< 30529 1726882588.49999: stderr chunk (state=3): >>><<< 30529 1726882588.50125: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882588.50128: _low_level_execute_command(): starting 30529 1726882588.50131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947 `" && echo ansible-tmp-1726882588.5002215-30619-11714319622947="` echo /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947 `" ) && sleep 0' 30529 1726882588.51204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882588.51209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.51276: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882588.51280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.51437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882588.51476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.51525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.53492: stdout chunk (state=3): >>>ansible-tmp-1726882588.5002215-30619-11714319622947=/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947 <<< 30529 1726882588.53521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882588.53562: stderr chunk (state=3): >>><<< 30529 1726882588.53565: stdout chunk (state=3): >>><<< 30529 1726882588.53590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882588.5002215-30619-11714319622947=/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882588.53899: variable 'ansible_module_compression' from source: unknown 30529 1726882588.53902: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30529 1726882588.53904: variable 'ansible_facts' from source: unknown 30529 1726882588.54348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py 30529 1726882588.54673: Sending initial data 30529 1726882588.54682: Sent initial data (153 bytes) 30529 1726882588.55802: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882588.55852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.55868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882588.56070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.56104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882588.56150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.56185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.57760: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882588.57815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882588.57876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9a3pddsm /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py <<< 30529 1726882588.57886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py" <<< 30529 1726882588.57946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9a3pddsm" to remote "/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py" <<< 30529 1726882588.60784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882588.60940: stderr chunk (state=3): >>><<< 30529 1726882588.60943: stdout chunk (state=3): >>><<< 30529 1726882588.60946: done transferring module to remote 30529 1726882588.60948: _low_level_execute_command(): starting 30529 1726882588.60950: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/ /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py && sleep 0' 30529 1726882588.61895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.61899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882588.61901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.61903: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882588.61914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.61975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882588.61978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882588.62423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.62709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882588.64332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882588.64359: stderr chunk (state=3): >>><<< 30529 1726882588.64369: stdout chunk (state=3): >>><<< 30529 1726882588.64390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882588.64573: _low_level_execute_command(): starting 30529 1726882588.64577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/AnsiballZ_setup.py && sleep 0' 30529 1726882588.65907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882588.65910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882588.65912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882588.65914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882588.65916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882588.65918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882588.66210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882588.66239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.29742: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": 
"3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_local": {}, "ansible_loadavg": {"1m": 0.578125, "5m": 0.42529296875, "15m": 0.234375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "28", "epoch": "1726882588", "epoch_int": "1726882588", "date": "2024-09-20", "time": "21:36:28", "iso8601_micro": "2024-09-21T01:36:28.933847Z", "iso8601": "2024-09-21T01:36:28Z", "iso8601_basic": "20240920T213628933847", "iso8601_basic_short": "20240920T213628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2967, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 564, "free": 2967}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_ve<<< 30529 1726882589.29777: stdout chunk (state=3): >>>rsion": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1022, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789810688, "block_size": 4096, "block_total": 65519099, "block_available": 63913528, "block_used": 1605571, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_v<<< 30529 1726882589.29789: stdout chunk (state=3): >>>lan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off <<< 30529 1726882589.29791: stdout chunk (state=3): >>>[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30529 1726882589.31717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882589.31792: stderr chunk (state=3): >>><<< 30529 1726882589.31798: stdout chunk (state=3): >>><<< 30529 1726882589.31883: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_local": {}, "ansible_loadavg": {"1m": 0.578125, "5m": 0.42529296875, "15m": 0.234375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "28", "epoch": "1726882588", "epoch_int": "1726882588", "date": "2024-09-20", "time": "21:36:28", "iso8601_micro": "2024-09-21T01:36:28.933847Z", "iso8601": "2024-09-21T01:36:28Z", "iso8601_basic": "20240920T213628933847", "iso8601_basic_short": "20240920T213628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2967, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 564, "free": 2967}, "nocache": {"free": 3307, "used": 224}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1022, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789810688, "block_size": 4096, "block_total": 65519099, "block_available": 63913528, "block_used": 1605571, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882589.32110: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882589.32137: _low_level_execute_command(): starting 30529 1726882589.32140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882588.5002215-30619-11714319622947/ > /dev/null 2>&1 && sleep 0' 30529 1726882589.32562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882589.32565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882589.32568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882589.32570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882589.32572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.32621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882589.32624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.32671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.34482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882589.34485: stdout chunk (state=3): >>><<< 30529 1726882589.34487: stderr chunk (state=3): >>><<< 30529 1726882589.34504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 30529 1726882589.34699: handler run complete 30529 1726882589.34702: variable 'ansible_facts' from source: unknown 30529 1726882589.34743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.35041: variable 'ansible_facts' from source: unknown 30529 1726882589.35127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.35257: attempt loop complete, returning result 30529 1726882589.35266: _execute() done 30529 1726882589.35272: dumping result to json 30529 1726882589.35306: done dumping result, returning 30529 1726882589.35317: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-b0f1-edc0-000000000077] 30529 1726882589.35324: sending task result for task 12673a56-9f93-b0f1-edc0-000000000077 ok: [managed_node1] 30529 1726882589.35953: no more pending results, returning what we have 30529 1726882589.35956: results queue empty 30529 1726882589.35957: checking for any_errors_fatal 30529 1726882589.35958: done checking for any_errors_fatal 30529 1726882589.35958: checking for max_fail_percentage 30529 1726882589.35960: done checking for max_fail_percentage 30529 1726882589.35961: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.35962: done checking to see if all hosts have failed 30529 1726882589.35962: getting the remaining hosts for this loop 30529 1726882589.35963: done getting the remaining hosts for this loop 30529 1726882589.35967: getting the next task for host managed_node1 30529 1726882589.35972: done getting next task for host managed_node1 30529 1726882589.35973: ^ task is: TASK: meta (flush_handlers) 30529 1726882589.35975: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882589.35978: getting variables 30529 1726882589.35980: in VariableManager get_vars() 30529 1726882589.36002: Calling all_inventory to load vars for managed_node1 30529 1726882589.36004: Calling groups_inventory to load vars for managed_node1 30529 1726882589.36007: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.36014: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000077 30529 1726882589.36017: WORKER PROCESS EXITING 30529 1726882589.36026: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.36029: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.36032: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.36187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.36378: done with get_vars() 30529 1726882589.36388: done getting variables 30529 1726882589.36456: in VariableManager get_vars() 30529 1726882589.36464: Calling all_inventory to load vars for managed_node1 30529 1726882589.36466: Calling groups_inventory to load vars for managed_node1 30529 1726882589.36469: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.36473: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.36475: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.36478: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.36626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.36810: done with get_vars() 30529 1726882589.36823: done queuing things up, now waiting for results queue to drain 30529 1726882589.36824: results queue empty 30529 1726882589.36825: checking for any_errors_fatal 30529 1726882589.36828: done 
checking for any_errors_fatal 30529 1726882589.36829: checking for max_fail_percentage 30529 1726882589.36830: done checking for max_fail_percentage 30529 1726882589.36835: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.36835: done checking to see if all hosts have failed 30529 1726882589.36836: getting the remaining hosts for this loop 30529 1726882589.36837: done getting the remaining hosts for this loop 30529 1726882589.36839: getting the next task for host managed_node1 30529 1726882589.36843: done getting next task for host managed_node1 30529 1726882589.36845: ^ task is: TASK: Show playbook name 30529 1726882589.36846: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882589.36848: getting variables 30529 1726882589.36849: in VariableManager get_vars() 30529 1726882589.36856: Calling all_inventory to load vars for managed_node1 30529 1726882589.36858: Calling groups_inventory to load vars for managed_node1 30529 1726882589.36860: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.36864: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.36867: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.36869: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.37002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.37184: done with get_vars() 30529 1726882589.37191: done getting variables 30529 1726882589.37263: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Friday 20 September 2024 21:36:29 -0400 (0:00:00.920) 0:00:03.398 ****** 30529 1726882589.37288: entering _queue_task() for managed_node1/debug 30529 1726882589.37290: Creating lock for debug 30529 1726882589.37537: worker is 1 (out of 1 available) 30529 1726882589.37548: exiting _queue_task() for managed_node1/debug 30529 1726882589.37560: done queuing things up, now waiting for results queue to drain 30529 1726882589.37561: waiting for pending results... 30529 1726882589.37778: running TaskExecutor() for managed_node1/TASK: Show playbook name 30529 1726882589.37861: in run() - task 12673a56-9f93-b0f1-edc0-00000000000b 30529 1726882589.37881: variable 'ansible_search_path' from source: unknown 30529 1726882589.37931: calling self._execute() 30529 1726882589.38010: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.38023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.38039: variable 'omit' from source: magic vars 30529 1726882589.38402: variable 'ansible_distribution_major_version' from source: facts 30529 1726882589.38423: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882589.38440: variable 'omit' from source: magic vars 30529 1726882589.38470: variable 'omit' from source: magic vars 30529 1726882589.38515: variable 'omit' from source: magic vars 30529 1726882589.38559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882589.38604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30529 1726882589.38629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882589.38651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.38670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.38710: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882589.38720: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.38727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.38837: Set connection var ansible_shell_executable to /bin/sh 30529 1726882589.38849: Set connection var ansible_pipelining to False 30529 1726882589.38857: Set connection var ansible_shell_type to sh 30529 1726882589.38871: Set connection var ansible_timeout to 10 30529 1726882589.38878: Set connection var ansible_connection to ssh 30529 1726882589.38998: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882589.39001: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.39004: variable 'ansible_connection' from source: unknown 30529 1726882589.39006: variable 'ansible_module_compression' from source: unknown 30529 1726882589.39008: variable 'ansible_shell_type' from source: unknown 30529 1726882589.39010: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.39011: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.39013: variable 'ansible_pipelining' from source: unknown 30529 1726882589.39014: variable 'ansible_timeout' from source: unknown 30529 1726882589.39016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.39089: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882589.39108: variable 'omit' from source: magic vars 30529 1726882589.39120: starting attempt loop 30529 1726882589.39132: running the handler 30529 1726882589.39180: handler run complete 30529 1726882589.39209: attempt loop complete, returning result 30529 1726882589.39217: _execute() done 30529 1726882589.39224: dumping result to json 30529 1726882589.39234: done dumping result, returning 30529 1726882589.39250: done running TaskExecutor() for managed_node1/TASK: Show playbook name [12673a56-9f93-b0f1-edc0-00000000000b] 30529 1726882589.39259: sending task result for task 12673a56-9f93-b0f1-edc0-00000000000b ok: [managed_node1] => {} MSG: this is: playbooks/tests_states.yml 30529 1726882589.39397: no more pending results, returning what we have 30529 1726882589.39400: results queue empty 30529 1726882589.39401: checking for any_errors_fatal 30529 1726882589.39403: done checking for any_errors_fatal 30529 1726882589.39403: checking for max_fail_percentage 30529 1726882589.39405: done checking for max_fail_percentage 30529 1726882589.39406: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.39407: done checking to see if all hosts have failed 30529 1726882589.39408: getting the remaining hosts for this loop 30529 1726882589.39410: done getting the remaining hosts for this loop 30529 1726882589.39413: getting the next task for host managed_node1 30529 1726882589.39421: done getting next task for host managed_node1 30529 1726882589.39424: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882589.39426: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882589.39430: getting variables 30529 1726882589.39431: in VariableManager get_vars() 30529 1726882589.39458: Calling all_inventory to load vars for managed_node1 30529 1726882589.39460: Calling groups_inventory to load vars for managed_node1 30529 1726882589.39464: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.39475: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.39478: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.39481: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.39889: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000000b 30529 1726882589.39892: WORKER PROCESS EXITING 30529 1726882589.39915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.40097: done with get_vars() 30529 1726882589.40106: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Friday 20 September 2024 21:36:29 -0400 (0:00:00.028) 0:00:03.427 ****** 30529 1726882589.40185: entering _queue_task() for managed_node1/include_tasks 30529 1726882589.40601: worker is 1 (out of 1 available) 30529 1726882589.40607: exiting _queue_task() for managed_node1/include_tasks 30529 1726882589.40616: done queuing things up, now waiting for results queue to drain 30529 1726882589.40617: waiting for pending results... 
30529 1726882589.40648: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml'
30529 1726882589.40729: in run() - task 12673a56-9f93-b0f1-edc0-00000000000d
30529 1726882589.40750: variable 'ansible_search_path' from source: unknown
30529 1726882589.40784: calling self._execute()
30529 1726882589.40900: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.40903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.40906: variable 'omit' from source: magic vars
30529 1726882589.41210: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.41228: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.41238: _execute() done
30529 1726882589.41247: dumping result to json
30529 1726882589.41255: done dumping result, returning
30529 1726882589.41266: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-00000000000d]
30529 1726882589.41389: sending task result for task 12673a56-9f93-b0f1-edc0-00000000000d
30529 1726882589.41463: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000000d
30529 1726882589.41466: WORKER PROCESS EXITING
30529 1726882589.41516: no more pending results, returning what we have
30529 1726882589.41521: in VariableManager get_vars()
30529 1726882589.41549: Calling all_inventory to load vars for managed_node1
30529 1726882589.41551: Calling groups_inventory to load vars for managed_node1
30529 1726882589.41555: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882589.41566: Calling all_plugins_play to load vars for managed_node1
30529 1726882589.41568: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882589.41571: Calling groups_plugins_play to load vars for managed_node1
30529 1726882589.41810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882589.41996: done with get_vars()
30529 1726882589.42003: variable 'ansible_search_path' from source: unknown
30529 1726882589.42015: we have included files to process
30529 1726882589.42017: generating all_blocks data
30529 1726882589.42018: done generating all_blocks data
30529 1726882589.42019: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30529 1726882589.42020: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30529 1726882589.42023: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30529 1726882589.42538: in VariableManager get_vars()
30529 1726882589.42554: done with get_vars()
30529 1726882589.42592: in VariableManager get_vars()
30529 1726882589.42608: done with get_vars()
30529 1726882589.42644: in VariableManager get_vars()
30529 1726882589.42657: done with get_vars()
30529 1726882589.42695: in VariableManager get_vars()
30529 1726882589.42709: done with get_vars()
30529 1726882589.42757: in VariableManager get_vars()
30529 1726882589.42770: done with get_vars()
30529 1726882589.43092: in VariableManager get_vars()
30529 1726882589.43108: done with get_vars()
30529 1726882589.43119: done processing included file
30529 1726882589.43121: iterating over new_blocks loaded from include file
30529 1726882589.43122: in VariableManager get_vars()
30529 1726882589.43131: done with get_vars()
30529 1726882589.43133: filtering new block on tags
30529 1726882589.43226: done filtering new block on tags
30529 1726882589.43229: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1
30529 1726882589.43233: extending task lists for all hosts with included blocks
30529 1726882589.43264: done extending task lists
30529 1726882589.43265: done processing included files
30529 1726882589.43266: results queue empty
30529 1726882589.43267: checking for any_errors_fatal
30529 1726882589.43270: done checking for any_errors_fatal
30529 1726882589.43271: checking for max_fail_percentage
30529 1726882589.43272: done checking for max_fail_percentage
30529 1726882589.43272: checking to see if all hosts have failed and the running result is not ok
30529 1726882589.43274: done checking to see if all hosts have failed
30529 1726882589.43274: getting the remaining hosts for this loop
30529 1726882589.43276: done getting the remaining hosts for this loop
30529 1726882589.43279: getting the next task for host managed_node1
30529 1726882589.43283: done getting next task for host managed_node1
30529 1726882589.43285: ^ task is: TASK: TEST: {{ lsr_description }}
30529 1726882589.43287: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882589.43289: getting variables
30529 1726882589.43290: in VariableManager get_vars()
30529 1726882589.43299: Calling all_inventory to load vars for managed_node1
30529 1726882589.43301: Calling groups_inventory to load vars for managed_node1
30529 1726882589.43303: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882589.43308: Calling all_plugins_play to load vars for managed_node1
30529 1726882589.43310: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882589.43313: Calling groups_plugins_play to load vars for managed_node1
30529 1726882589.43623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882589.43805: done with get_vars()
30529 1726882589.43813: done getting variables
30529 1726882589.43848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
30529 1726882589.43953: variable 'lsr_description' from source: include params

TASK [TEST: I can create a profile] ********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024 21:36:29 -0400 (0:00:00.038) 0:00:03.465 ******
30529 1726882589.43988: entering _queue_task() for managed_node1/debug
30529 1726882589.44210: worker is 1 (out of 1 available)
30529 1726882589.44221: exiting _queue_task() for managed_node1/debug
30529 1726882589.44233: done queuing things up, now waiting for results queue to drain
30529 1726882589.44234: waiting for pending results...
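The TASK banner above comes from a `debug` task at run_test.yml:5. A minimal sketch of what that task likely looks like, reconstructed only from the banner text and the "########## I can create a profile ##########" MSG that this log prints for it (the exact task name and message template in run_test.yml are assumptions):

```yaml
# Hypothetical reconstruction of run_test.yml:5; the real file may differ.
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "########## {{ lsr_description }} ##########"
```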
30529 1726882589.44463: running TaskExecutor() for managed_node1/TASK: TEST: I can create a profile
30529 1726882589.44580: in run() - task 12673a56-9f93-b0f1-edc0-000000000091
30529 1726882589.44583: variable 'ansible_search_path' from source: unknown
30529 1726882589.44585: variable 'ansible_search_path' from source: unknown
30529 1726882589.44615: calling self._execute()
30529 1726882589.44696: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.44798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.44802: variable 'omit' from source: magic vars
30529 1726882589.45083: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.45104: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.45115: variable 'omit' from source: magic vars
30529 1726882589.45157: variable 'omit' from source: magic vars
30529 1726882589.45264: variable 'lsr_description' from source: include params
30529 1726882589.45285: variable 'omit' from source: magic vars
30529 1726882589.45326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882589.45372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.45400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882589.45424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.45441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.45481: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.45569: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.45572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.45609: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.45622: Set connection var ansible_pipelining to False
30529 1726882589.45630: Set connection var ansible_shell_type to sh
30529 1726882589.45646: Set connection var ansible_timeout to 10
30529 1726882589.45653: Set connection var ansible_connection to ssh
30529 1726882589.45663: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.45697: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.45706: variable 'ansible_connection' from source: unknown
30529 1726882589.45714: variable 'ansible_module_compression' from source: unknown
30529 1726882589.45798: variable 'ansible_shell_type' from source: unknown
30529 1726882589.45801: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.45804: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.45806: variable 'ansible_pipelining' from source: unknown
30529 1726882589.45808: variable 'ansible_timeout' from source: unknown
30529 1726882589.45810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.45882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.45902: variable 'omit' from source: magic vars
30529 1726882589.45915: starting attempt loop
30529 1726882589.45922: running the handler
30529 1726882589.45970: handler run complete
30529 1726882589.45997: attempt loop complete, returning result
30529 1726882589.46000: _execute() done
30529 1726882589.46003: dumping result to json
30529 1726882589.46202: done dumping result, returning
30529 1726882589.46205: done running TaskExecutor() for managed_node1/TASK: TEST: I can create a profile [12673a56-9f93-b0f1-edc0-000000000091]
30529 1726882589.46208: sending task result for task 12673a56-9f93-b0f1-edc0-000000000091
30529 1726882589.46265: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000091
30529 1726882589.46268: WORKER PROCESS EXITING

ok: [managed_node1] => {}

MSG:

########## I can create a profile ##########

30529 1726882589.46306: no more pending results, returning what we have
30529 1726882589.46310: results queue empty
30529 1726882589.46311: checking for any_errors_fatal
30529 1726882589.46312: done checking for any_errors_fatal
30529 1726882589.46312: checking for max_fail_percentage
30529 1726882589.46314: done checking for max_fail_percentage
30529 1726882589.46315: checking to see if all hosts have failed and the running result is not ok
30529 1726882589.46315: done checking to see if all hosts have failed
30529 1726882589.46316: getting the remaining hosts for this loop
30529 1726882589.46317: done getting the remaining hosts for this loop
30529 1726882589.46321: getting the next task for host managed_node1
30529 1726882589.46326: done getting next task for host managed_node1
30529 1726882589.46328: ^ task is: TASK: Show item
30529 1726882589.46331: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882589.46334: getting variables
30529 1726882589.46336: in VariableManager get_vars()
30529 1726882589.46358: Calling all_inventory to load vars for managed_node1
30529 1726882589.46360: Calling groups_inventory to load vars for managed_node1
30529 1726882589.46363: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882589.46372: Calling all_plugins_play to load vars for managed_node1
30529 1726882589.46375: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882589.46378: Calling groups_plugins_play to load vars for managed_node1
30529 1726882589.46585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882589.46776: done with get_vars()
30529 1726882589.46785: done getting variables
30529 1726882589.46838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024 21:36:29 -0400 (0:00:00.028) 0:00:03.494 ******
30529 1726882589.46862: entering _queue_task() for managed_node1/debug
30529 1726882589.47069: worker is 1 (out of 1 available)
30529 1726882589.47079: exiting _queue_task() for managed_node1/debug
30529 1726882589.47091: done queuing things up, now waiting for results queue to drain
30529 1726882589.47094: waiting for pending results...
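The "Show item" task queued above loops over a list of variable names and prints each one with its value; the per-item results in this log show `ansible_loop_var: "item"` and each item name echoed alongside its contents. A plausible sketch of run_test.yml:9 under those assumptions (the loop membership is inferred from the `(item=...)` results and may be incomplete):

```yaml
# Hypothetical reconstruction of run_test.yml:9; inferred from the log output.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
```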
30529 1726882589.47309: running TaskExecutor() for managed_node1/TASK: Show item
30529 1726882589.47403: in run() - task 12673a56-9f93-b0f1-edc0-000000000092
30529 1726882589.47425: variable 'ansible_search_path' from source: unknown
30529 1726882589.47433: variable 'ansible_search_path' from source: unknown
30529 1726882589.47500: variable 'omit' from source: magic vars
30529 1726882589.47610: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.47632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.47698: variable 'omit' from source: magic vars
30529 1726882589.48030: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.48046: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.48062: variable 'omit' from source: magic vars
30529 1726882589.48103: variable 'omit' from source: magic vars
30529 1726882589.48152: variable 'item' from source: unknown
30529 1726882589.48230: variable 'item' from source: unknown
30529 1726882589.48248: variable 'omit' from source: magic vars
30529 1726882589.48289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882589.48329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.48388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882589.48391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.48395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.48420: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.48428: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.48434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.48527: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.48538: Set connection var ansible_pipelining to False
30529 1726882589.48606: Set connection var ansible_shell_type to sh
30529 1726882589.48609: Set connection var ansible_timeout to 10
30529 1726882589.48611: Set connection var ansible_connection to ssh
30529 1726882589.48613: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.48616: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.48618: variable 'ansible_connection' from source: unknown
30529 1726882589.48620: variable 'ansible_module_compression' from source: unknown
30529 1726882589.48622: variable 'ansible_shell_type' from source: unknown
30529 1726882589.48624: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.48626: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.48627: variable 'ansible_pipelining' from source: unknown
30529 1726882589.48630: variable 'ansible_timeout' from source: unknown
30529 1726882589.48632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.48754: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.48772: variable 'omit' from source: magic vars
30529 1726882589.48782: starting attempt loop
30529 1726882589.48788: running the handler
30529 1726882589.48838: variable 'lsr_description' from source: include params
30529 1726882589.48902: variable 'lsr_description' from source: include params
30529 1726882589.48930: handler run complete
30529 1726882589.48940: attempt loop complete, returning result
30529 1726882589.48958: variable 'item' from source: unknown
30529 1726882589.49039: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can create a profile"
}

30529 1726882589.49400: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.49403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.49405: variable 'omit' from source: magic vars
30529 1726882589.49407: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.49409: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.49411: variable 'omit' from source: magic vars
30529 1726882589.49412: variable 'omit' from source: magic vars
30529 1726882589.49448: variable 'item' from source: unknown
30529 1726882589.49515: variable 'item' from source: unknown
30529 1726882589.49535: variable 'omit' from source: magic vars
30529 1726882589.49555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.49566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.49576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.49590: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.49616: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.49619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.49681: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.49691: Set connection var ansible_pipelining to False
30529 1726882589.49725: Set connection var ansible_shell_type to sh
30529 1726882589.49727: Set connection var ansible_timeout to 10
30529 1726882589.49730: Set connection var ansible_connection to ssh
30529 1726882589.49732: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.49747: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.49753: variable 'ansible_connection' from source: unknown
30529 1726882589.49759: variable 'ansible_module_compression' from source: unknown
30529 1726882589.49833: variable 'ansible_shell_type' from source: unknown
30529 1726882589.49836: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.49838: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.49840: variable 'ansible_pipelining' from source: unknown
30529 1726882589.49842: variable 'ansible_timeout' from source: unknown
30529 1726882589.49844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.49880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.49896: variable 'omit' from source: magic vars
30529 1726882589.49905: starting attempt loop
30529 1726882589.49911: running the handler
30529 1726882589.49933: variable 'lsr_setup' from source: include params
30529 1726882589.50003: variable 'lsr_setup' from source: include params
30529 1726882589.50052: handler run complete
30529 1726882589.50069: attempt loop complete, returning result
30529 1726882589.50087: variable 'item' from source: unknown
30529 1726882589.50149: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/delete_interface.yml",
        "tasks/assert_device_absent.yml"
    ]
}

30529 1726882589.50361: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.50364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.50366: variable 'omit' from source: magic vars
30529 1726882589.50481: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.50484: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.50486: variable 'omit' from source: magic vars
30529 1726882589.50498: variable 'omit' from source: magic vars
30529 1726882589.50539: variable 'item' from source: unknown
30529 1726882589.50699: variable 'item' from source: unknown
30529 1726882589.50702: variable 'omit' from source: magic vars
30529 1726882589.50704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.50706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.50708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.50710: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.50712: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.50714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.50748: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.50757: Set connection var ansible_pipelining to False
30529 1726882589.50763: Set connection var ansible_shell_type to sh
30529 1726882589.50775: Set connection var ansible_timeout to 10
30529 1726882589.50780: Set connection var ansible_connection to ssh
30529 1726882589.50788: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.50815: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.50822: variable 'ansible_connection' from source: unknown
30529 1726882589.50828: variable 'ansible_module_compression' from source: unknown
30529 1726882589.50834: variable 'ansible_shell_type' from source: unknown
30529 1726882589.50839: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.50845: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.50852: variable 'ansible_pipelining' from source: unknown
30529 1726882589.50857: variable 'ansible_timeout' from source: unknown
30529 1726882589.50864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.50952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.50963: variable 'omit' from source: magic vars
30529 1726882589.50971: starting attempt loop
30529 1726882589.50977: running the handler
30529 1726882589.50999: variable 'lsr_test' from source: include params
30529 1726882589.51066: variable 'lsr_test' from source: include params
30529 1726882589.51132: handler run complete
30529 1726882589.51135: attempt loop complete, returning result
30529 1726882589.51138: variable 'item' from source: unknown
30529 1726882589.51181: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/create_bridge_profile.yml"
    ]
}

30529 1726882589.51426: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.51429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.51431: variable 'omit' from source: magic vars
30529 1726882589.51484: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.51497: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.51508: variable 'omit' from source: magic vars
30529 1726882589.51525: variable 'omit' from source: magic vars
30529 1726882589.51573: variable 'item' from source: unknown
30529 1726882589.51636: variable 'item' from source: unknown
30529 1726882589.51660: variable 'omit' from source: magic vars
30529 1726882589.51682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.51698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.51708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.51720: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.51726: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.51763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.51800: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.51808: Set connection var ansible_pipelining to False
30529 1726882589.51814: Set connection var ansible_shell_type to sh
30529 1726882589.51824: Set connection var ansible_timeout to 10
30529 1726882589.51829: Set connection var ansible_connection to ssh
30529 1726882589.51836: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.51854: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.51862: variable 'ansible_connection' from source: unknown
30529 1726882589.51898: variable 'ansible_module_compression' from source: unknown
30529 1726882589.51901: variable 'ansible_shell_type' from source: unknown
30529 1726882589.51904: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.51906: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.51908: variable 'ansible_pipelining' from source: unknown
30529 1726882589.51910: variable 'ansible_timeout' from source: unknown
30529 1726882589.51912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.52087: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.52092: variable 'omit' from source: magic vars
30529 1726882589.52097: starting attempt loop
30529 1726882589.52099: running the handler
30529 1726882589.52101: variable 'lsr_assert' from source: include params
30529 1726882589.52110: variable 'lsr_assert' from source: include params
30529 1726882589.52130: handler run complete
30529 1726882589.52199: attempt loop complete, returning result
30529 1726882589.52202: variable 'item' from source: unknown
30529 1726882589.52232: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_assert) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert",
    "lsr_assert": [
        "tasks/assert_profile_present.yml"
    ]
}

30529 1726882589.52434: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.52437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.52440: variable 'omit' from source: magic vars
30529 1726882589.52545: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.52555: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.52568: variable 'omit' from source: magic vars
30529 1726882589.52585: variable 'omit' from source: magic vars
30529 1726882589.52629: variable 'item' from source: unknown
30529 1726882589.52698: variable 'item' from source: unknown
30529 1726882589.52761: variable 'omit' from source: magic vars
30529 1726882589.52764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.52766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.52768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.52770: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.52776: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.52784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.52857: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.52872: Set connection var ansible_pipelining to False
30529 1726882589.52880: Set connection var ansible_shell_type to sh
30529 1726882589.52896: Set connection var ansible_timeout to 10
30529 1726882589.52978: Set connection var ansible_connection to ssh
30529 1726882589.52981: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.52984: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.52986: variable 'ansible_connection' from source: unknown
30529 1726882589.52988: variable 'ansible_module_compression' from source: unknown
30529 1726882589.52990: variable 'ansible_shell_type' from source: unknown
30529 1726882589.52991: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.52995: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.52997: variable 'ansible_pipelining' from source: unknown
30529 1726882589.52999: variable 'ansible_timeout' from source: unknown
30529 1726882589.53001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.53060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.53072: variable 'omit' from source: magic vars
30529 1726882589.53084: starting attempt loop
30529 1726882589.53097: running the handler
30529 1726882589.53117: variable 'lsr_assert_when' from source: include params
30529 1726882589.53182: variable 'lsr_assert_when' from source: include params
30529 1726882589.53272: variable 'network_provider' from source: set_fact
30529 1726882589.53313: handler run complete
30529 1726882589.53332: attempt loop complete, returning result
30529 1726882589.53398: variable 'item' from source: unknown
30529 1726882589.53416: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_assert_when) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert_when",
    "lsr_assert_when": [
        {
            "condition": true,
            "what": "tasks/assert_device_present.yml"
        }
    ]
}

30529 1726882589.53734: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.53737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.53740: variable 'omit' from source: magic vars
30529 1726882589.53775: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.53786: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.53796: variable 'omit' from source: magic vars
30529 1726882589.53814: variable 'omit' from source: magic vars
30529 1726882589.53963: variable 'item' from source: unknown
30529 1726882589.53966: variable 'item' from source: unknown
30529 1726882589.53969: variable 'omit' from source: magic vars
30529 1726882589.53971: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882589.53986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.54000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882589.54015: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882589.54034: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.54041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.54117: Set connection var ansible_shell_executable to /bin/sh
30529 1726882589.54128: Set connection var ansible_pipelining to False
30529 1726882589.54135: Set connection var ansible_shell_type to sh
30529 1726882589.54149: Set connection var ansible_timeout to 10
30529 1726882589.54156: Set connection var ansible_connection to ssh
30529 1726882589.54166: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882589.54194: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.54204: variable 'ansible_connection' from source: unknown
30529 1726882589.54211: variable 'ansible_module_compression' from source: unknown
30529 1726882589.54218: variable 'ansible_shell_type' from source: unknown
30529 1726882589.54290: variable 'ansible_shell_executable' from source: unknown
30529 1726882589.54295: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.54297: variable 'ansible_pipelining' from source: unknown
30529 1726882589.54300: variable 'ansible_timeout' from source: unknown
30529 1726882589.54302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.54340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882589.54352: variable 'omit' from source: magic vars
30529 1726882589.54360: starting attempt loop
30529 1726882589.54367: running the handler
30529 1726882589.54387: variable 'lsr_fail_debug' from source: play vars
30529 1726882589.54453: variable 'lsr_fail_debug' from source: play vars
30529 1726882589.54472: handler run complete
30529 1726882589.54489: attempt loop complete, returning result
30529 1726882589.54512: variable 'item' from source: unknown
30529 1726882589.54573: variable 'item' from source: unknown

ok: [managed_node1] => (item=lsr_fail_debug) => {
    "ansible_loop_var": "item",
    "item": "lsr_fail_debug",
    "lsr_fail_debug": [
        "__network_connections_result"
    ]
}

30529 1726882589.54833: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882589.54836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882589.54839: variable 'omit' from source: magic vars
30529 1726882589.54878: variable 'ansible_distribution_major_version' from source: facts
30529 1726882589.54889: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882589.54899: variable 'omit' from source: magic vars
30529 1726882589.54916: variable 'omit' from source: magic vars
30529 1726882589.54961: variable 'item' from source: unknown
30529 1726882589.55023: variable 'item' from source: unknown
30529 1726882589.55042: variable 'omit' from source: magic vars
30529 1726882589.55069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True,
class_only=False) 30529 1726882589.55086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.55100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.55114: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882589.55121: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.55128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.55203: Set connection var ansible_shell_executable to /bin/sh 30529 1726882589.55215: Set connection var ansible_pipelining to False 30529 1726882589.55270: Set connection var ansible_shell_type to sh 30529 1726882589.55273: Set connection var ansible_timeout to 10 30529 1726882589.55275: Set connection var ansible_connection to ssh 30529 1726882589.55277: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882589.55280: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.55282: variable 'ansible_connection' from source: unknown 30529 1726882589.55284: variable 'ansible_module_compression' from source: unknown 30529 1726882589.55286: variable 'ansible_shell_type' from source: unknown 30529 1726882589.55294: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.55303: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.55311: variable 'ansible_pipelining' from source: unknown 30529 1726882589.55317: variable 'ansible_timeout' from source: unknown 30529 1726882589.55325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.55413: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882589.55423: variable 'omit' from source: magic vars 30529 1726882589.55487: starting attempt loop 30529 1726882589.55490: running the handler 30529 1726882589.55492: variable 'lsr_cleanup' from source: include params 30529 1726882589.55536: variable 'lsr_cleanup' from source: include params 30529 1726882589.55554: handler run complete 30529 1726882589.55568: attempt loop complete, returning result 30529 1726882589.55584: variable 'item' from source: unknown 30529 1726882589.55650: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30529 1726882589.55829: dumping result to json 30529 1726882589.55831: done dumping result, returning 30529 1726882589.55834: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-000000000092] 30529 1726882589.55836: sending task result for task 12673a56-9f93-b0f1-edc0-000000000092 30529 1726882589.55875: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000092 30529 1726882589.55878: WORKER PROCESS EXITING 30529 1726882589.55940: no more pending results, returning what we have 30529 1726882589.55943: results queue empty 30529 1726882589.55944: checking for any_errors_fatal 30529 1726882589.55949: done checking for any_errors_fatal 30529 1726882589.55950: checking for max_fail_percentage 30529 1726882589.55951: done checking for max_fail_percentage 30529 1726882589.55952: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.55953: done checking to see if all hosts have failed 30529 1726882589.55953: getting the remaining hosts for this loop 30529 1726882589.55955: done getting the remaining hosts for this loop 30529 
1726882589.55959: getting the next task for host managed_node1 30529 1726882589.55967: done getting next task for host managed_node1 30529 1726882589.55969: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882589.55972: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882589.55976: getting variables 30529 1726882589.55978: in VariableManager get_vars() 30529 1726882589.56002: Calling all_inventory to load vars for managed_node1 30529 1726882589.56005: Calling groups_inventory to load vars for managed_node1 30529 1726882589.56010: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.56022: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.56025: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.56028: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.56503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.56727: done with get_vars() 30529 1726882589.56736: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:36:29 -0400 (0:00:00.101) 0:00:03.596 ****** 30529 1726882589.57023: entering _queue_task() for managed_node1/include_tasks 30529 
1726882589.57360: worker is 1 (out of 1 available) 30529 1726882589.57372: exiting _queue_task() for managed_node1/include_tasks 30529 1726882589.57384: done queuing things up, now waiting for results queue to drain 30529 1726882589.57385: waiting for pending results... 30529 1726882589.57908: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882589.58001: in run() - task 12673a56-9f93-b0f1-edc0-000000000093 30529 1726882589.58005: variable 'ansible_search_path' from source: unknown 30529 1726882589.58007: variable 'ansible_search_path' from source: unknown 30529 1726882589.58020: calling self._execute() 30529 1726882589.58099: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.58114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.58198: variable 'omit' from source: magic vars 30529 1726882589.58472: variable 'ansible_distribution_major_version' from source: facts 30529 1726882589.58495: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882589.58542: _execute() done 30529 1726882589.58545: dumping result to json 30529 1726882589.58547: done dumping result, returning 30529 1726882589.58549: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-000000000093] 30529 1726882589.58551: sending task result for task 12673a56-9f93-b0f1-edc0-000000000093 30529 1726882589.58714: no more pending results, returning what we have 30529 1726882589.58720: in VariableManager get_vars() 30529 1726882589.58751: Calling all_inventory to load vars for managed_node1 30529 1726882589.58758: Calling groups_inventory to load vars for managed_node1 30529 1726882589.58762: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.58775: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.58778: Calling groups_plugins_inventory to load 
vars for managed_node1 30529 1726882589.58781: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.59104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.59338: done with get_vars() 30529 1726882589.59344: variable 'ansible_search_path' from source: unknown 30529 1726882589.59346: variable 'ansible_search_path' from source: unknown 30529 1726882589.59359: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000093 30529 1726882589.59363: WORKER PROCESS EXITING 30529 1726882589.59396: we have included files to process 30529 1726882589.59398: generating all_blocks data 30529 1726882589.59400: done generating all_blocks data 30529 1726882589.59403: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882589.59404: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882589.59407: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882589.59554: in VariableManager get_vars() 30529 1726882589.59569: done with get_vars() 30529 1726882589.59669: done processing included file 30529 1726882589.59671: iterating over new_blocks loaded from include file 30529 1726882589.59672: in VariableManager get_vars() 30529 1726882589.59683: done with get_vars() 30529 1726882589.59684: filtering new block on tags 30529 1726882589.59730: done filtering new block on tags 30529 1726882589.59732: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 1726882589.59737: extending task lists for all hosts with included blocks 30529 1726882589.60827: 
done extending task lists 30529 1726882589.60828: done processing included files 30529 1726882589.60829: results queue empty 30529 1726882589.60830: checking for any_errors_fatal 30529 1726882589.60835: done checking for any_errors_fatal 30529 1726882589.60836: checking for max_fail_percentage 30529 1726882589.60837: done checking for max_fail_percentage 30529 1726882589.60838: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.60839: done checking to see if all hosts have failed 30529 1726882589.60839: getting the remaining hosts for this loop 30529 1726882589.60841: done getting the remaining hosts for this loop 30529 1726882589.60843: getting the next task for host managed_node1 30529 1726882589.60848: done getting next task for host managed_node1 30529 1726882589.60850: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882589.60852: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882589.60855: getting variables 30529 1726882589.60856: in VariableManager get_vars() 30529 1726882589.60863: Calling all_inventory to load vars for managed_node1 30529 1726882589.60865: Calling groups_inventory to load vars for managed_node1 30529 1726882589.60868: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.60872: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.60874: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.60877: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.61248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.61639: done with get_vars() 30529 1726882589.61647: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:29 -0400 (0:00:00.046) 0:00:03.643 ****** 30529 1726882589.61716: entering _queue_task() for managed_node1/include_tasks 30529 1726882589.62147: worker is 1 (out of 1 available) 30529 1726882589.62158: exiting _queue_task() for managed_node1/include_tasks 30529 1726882589.62170: done queuing things up, now waiting for results queue to drain 30529 1726882589.62171: waiting for pending results... 
30529 1726882589.62645: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882589.62811: in run() - task 12673a56-9f93-b0f1-edc0-0000000000ba 30529 1726882589.62836: variable 'ansible_search_path' from source: unknown 30529 1726882589.62844: variable 'ansible_search_path' from source: unknown 30529 1726882589.62882: calling self._execute() 30529 1726882589.63069: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.63080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.63096: variable 'omit' from source: magic vars 30529 1726882589.63772: variable 'ansible_distribution_major_version' from source: facts 30529 1726882589.63792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882589.63805: _execute() done 30529 1726882589.63816: dumping result to json 30529 1726882589.63824: done dumping result, returning 30529 1726882589.63834: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-0000000000ba] 30529 1726882589.63998: sending task result for task 12673a56-9f93-b0f1-edc0-0000000000ba 30529 1726882589.64081: no more pending results, returning what we have 30529 1726882589.64085: in VariableManager get_vars() 30529 1726882589.64117: Calling all_inventory to load vars for managed_node1 30529 1726882589.64120: Calling groups_inventory to load vars for managed_node1 30529 1726882589.64123: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.64136: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.64138: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.64141: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.64542: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000000ba 30529 1726882589.64546: WORKER PROCESS EXITING 30529 
1726882589.64567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.64959: done with get_vars() 30529 1726882589.64965: variable 'ansible_search_path' from source: unknown 30529 1726882589.64967: variable 'ansible_search_path' from source: unknown 30529 1726882589.65202: we have included files to process 30529 1726882589.65204: generating all_blocks data 30529 1726882589.65205: done generating all_blocks data 30529 1726882589.65207: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882589.65208: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882589.65210: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882589.65696: done processing included file 30529 1726882589.65699: iterating over new_blocks loaded from include file 30529 1726882589.65700: in VariableManager get_vars() 30529 1726882589.65713: done with get_vars() 30529 1726882589.65714: filtering new block on tags 30529 1726882589.65746: done filtering new block on tags 30529 1726882589.65748: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882589.65752: extending task lists for all hosts with included blocks 30529 1726882589.66102: done extending task lists 30529 1726882589.66103: done processing included files 30529 1726882589.66104: results queue empty 30529 1726882589.66105: checking for any_errors_fatal 30529 1726882589.66107: done checking for any_errors_fatal 30529 1726882589.66108: checking for max_fail_percentage 30529 1726882589.66109: done 
checking for max_fail_percentage 30529 1726882589.66110: checking to see if all hosts have failed and the running result is not ok 30529 1726882589.66111: done checking to see if all hosts have failed 30529 1726882589.66111: getting the remaining hosts for this loop 30529 1726882589.66113: done getting the remaining hosts for this loop 30529 1726882589.66115: getting the next task for host managed_node1 30529 1726882589.66119: done getting next task for host managed_node1 30529 1726882589.66121: ^ task is: TASK: Gather current interface info 30529 1726882589.66124: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882589.66126: getting variables 30529 1726882589.66127: in VariableManager get_vars() 30529 1726882589.66134: Calling all_inventory to load vars for managed_node1 30529 1726882589.66136: Calling groups_inventory to load vars for managed_node1 30529 1726882589.66139: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882589.66143: Calling all_plugins_play to load vars for managed_node1 30529 1726882589.66145: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882589.66148: Calling groups_plugins_play to load vars for managed_node1 30529 1726882589.66512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882589.66854: done with get_vars() 30529 1726882589.66863: done getting variables 30529 1726882589.66902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:29 -0400 (0:00:00.052) 0:00:03.695 ****** 30529 1726882589.66928: entering _queue_task() for managed_node1/command 30529 1726882589.67158: worker is 1 (out of 1 available) 30529 1726882589.67169: exiting _queue_task() for managed_node1/command 30529 1726882589.67181: done queuing things up, now waiting for results queue to drain 30529 1726882589.67183: waiting for pending results... 
30529 1726882589.67675: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882589.68099: in run() - task 12673a56-9f93-b0f1-edc0-0000000000f5 30529 1726882589.68104: variable 'ansible_search_path' from source: unknown 30529 1726882589.68107: variable 'ansible_search_path' from source: unknown 30529 1726882589.68109: calling self._execute() 30529 1726882589.68140: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.68228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.68243: variable 'omit' from source: magic vars 30529 1726882589.69020: variable 'ansible_distribution_major_version' from source: facts 30529 1726882589.69040: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882589.69050: variable 'omit' from source: magic vars 30529 1726882589.69104: variable 'omit' from source: magic vars 30529 1726882589.69299: variable 'omit' from source: magic vars 30529 1726882589.69344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882589.69392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882589.69502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882589.69526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.69544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882589.69574: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882589.69698: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.69702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882589.69907: Set connection var ansible_shell_executable to /bin/sh 30529 1726882589.69999: Set connection var ansible_pipelining to False 30529 1726882589.70003: Set connection var ansible_shell_type to sh 30529 1726882589.70005: Set connection var ansible_timeout to 10 30529 1726882589.70008: Set connection var ansible_connection to ssh 30529 1726882589.70010: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882589.70012: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.70014: variable 'ansible_connection' from source: unknown 30529 1726882589.70021: variable 'ansible_module_compression' from source: unknown 30529 1726882589.70023: variable 'ansible_shell_type' from source: unknown 30529 1726882589.70025: variable 'ansible_shell_executable' from source: unknown 30529 1726882589.70027: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882589.70028: variable 'ansible_pipelining' from source: unknown 30529 1726882589.70030: variable 'ansible_timeout' from source: unknown 30529 1726882589.70032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882589.70384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882589.70500: variable 'omit' from source: magic vars 30529 1726882589.70503: starting attempt loop 30529 1726882589.70507: running the handler 30529 1726882589.70510: _low_level_execute_command(): starting 30529 1726882589.70599: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882589.72001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882589.72018: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882589.72242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.72246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882589.72265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882589.72292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.72539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.74114: stdout chunk (state=3): >>>/root <<< 30529 1726882589.74209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882589.74291: stderr chunk (state=3): >>><<< 30529 1726882589.74304: stdout chunk (state=3): >>><<< 30529 1726882589.74331: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882589.74358: _low_level_execute_command(): starting 30529 1726882589.74406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220 `" && echo ansible-tmp-1726882589.7434344-30666-240903545642220="` echo /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220 `" ) && sleep 0' 30529 1726882589.75152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.75234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882589.75272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.75374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.77206: stdout chunk (state=3): >>>ansible-tmp-1726882589.7434344-30666-240903545642220=/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220 <<< 30529 1726882589.77354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882589.77357: stdout chunk (state=3): >>><<< 30529 1726882589.77359: stderr chunk (state=3): >>><<< 30529 1726882589.77570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882589.7434344-30666-240903545642220=/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882589.77574: variable 'ansible_module_compression' from source: unknown 30529 1726882589.77583: ANSIBALLZ: Using generic lock for ansible.legacy.command 30529 1726882589.77596: ANSIBALLZ: Acquiring lock 30529 1726882589.77604: ANSIBALLZ: Lock acquired: 139794692461328 30529 1726882589.77611: ANSIBALLZ: Creating module 30529 1726882589.91843: ANSIBALLZ: Writing module into payload 30529 1726882589.91925: ANSIBALLZ: Writing module 30529 1726882589.91941: ANSIBALLZ: Renaming module 30529 1726882589.91947: ANSIBALLZ: Done creating module 30529 1726882589.91999: variable 'ansible_facts' from source: unknown 30529 1726882589.92035: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py 30529 1726882589.92351: Sending initial data 30529 1726882589.92364: Sent initial data (156 bytes) 30529 1726882589.92999: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882589.93025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.93068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.93149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882589.93210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882589.93214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.93523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.95148: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882589.95155: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882589.95194: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882589.95198: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30529 1726882589.95201: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30529 1726882589.95204: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882589.95247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882589.95290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgrgoy79s /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py <<< 30529 1726882589.95295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py" <<< 30529 1726882589.95329: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgrgoy79s" to remote "/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py" <<< 30529 1726882589.96214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882589.96217: stderr chunk (state=3): >>><<< 30529 1726882589.96219: stdout chunk (state=3): >>><<< 30529 1726882589.96221: done transferring module to remote 30529 1726882589.96223: _low_level_execute_command(): starting 30529 1726882589.96225: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/ /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py && sleep 0' 30529 1726882589.96891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882589.96950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.97006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882589.98810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882589.98820: stderr chunk (state=3): >>><<< 30529 1726882589.98823: stdout chunk (state=3): >>><<< 30529 1726882589.98842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882589.98846: _low_level_execute_command(): starting 30529 1726882589.98849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/AnsiballZ_command.py && sleep 0' 30529 1726882589.99251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882589.99254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.99258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882589.99261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882589.99263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882589.99306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882589.99313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882589.99363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
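(Editorial note, not part of the log: the `AnsiballZ_command.py` execution that follows reports its result as a single JSON object on stdout, which Ansible parses back into the task result. A minimal sketch of decoding such a payload — the field names and values are copied from this log, and the example string is abbreviated to a few of the keys actually present:)

```python
import json

# Abbreviated copy of the module's stdout as it appears in this log.
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0, "cmd": ["ls", "-1"]}'

result = json.loads(raw)
# The command module joins output lines with "\n"; split them back apart.
interfaces = result["stdout"].splitlines()
print(interfaces)  # the three entries seen under /sys/class/net in this run
```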
30529 1726882590.15150: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:30.146407", "end": "2024-09-20 21:36:30.149791", "delta": "0:00:00.003384", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882590.16612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882590.16654: stderr chunk (state=3): >>><<< 30529 1726882590.16761: stdout chunk (state=3): >>><<< 30529 1726882590.16765: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:30.146407", "end": "2024-09-20 21:36:30.149791", "delta": "0:00:00.003384", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882590.16768: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882590.16771: _low_level_execute_command(): starting 30529 1726882590.16773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882589.7434344-30666-240903545642220/ > /dev/null 2>&1 && sleep 0' 30529 1726882590.17556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882590.17570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882590.17611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.17719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.17741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.17755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.17947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.19899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.19902: stdout chunk (state=3): >>><<< 30529 1726882590.19904: stderr chunk (state=3): >>><<< 30529 1726882590.19907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.19909: handler run complete 30529 1726882590.19911: Evaluated conditional (False): False 30529 1726882590.19913: attempt loop complete, returning result 30529 1726882590.19914: _execute() done 30529 1726882590.19916: dumping result to json 30529 1726882590.19918: done dumping result, returning 30529 1726882590.19920: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-0000000000f5] 30529 1726882590.19921: sending task result for task 12673a56-9f93-b0f1-edc0-0000000000f5 30529 1726882590.20003: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000000f5 30529 1726882590.20006: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003384", "end": "2024-09-20 21:36:30.149791", "rc": 0, "start": "2024-09-20 21:36:30.146407" } STDOUT: bonding_masters eth0 lo 30529 1726882590.20083: no more pending results, returning what we have 30529 1726882590.20088: results queue empty 30529 1726882590.20090: checking for any_errors_fatal 30529 1726882590.20091: done checking for any_errors_fatal 30529 1726882590.20092: checking for max_fail_percentage 30529 1726882590.20095: done checking for max_fail_percentage 30529 1726882590.20096: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.20097: done 
checking to see if all hosts have failed 30529 1726882590.20098: getting the remaining hosts for this loop 30529 1726882590.20099: done getting the remaining hosts for this loop 30529 1726882590.20104: getting the next task for host managed_node1 30529 1726882590.20112: done getting next task for host managed_node1 30529 1726882590.20114: ^ task is: TASK: Set current_interfaces 30529 1726882590.20120: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882590.20123: getting variables 30529 1726882590.20126: in VariableManager get_vars() 30529 1726882590.20154: Calling all_inventory to load vars for managed_node1 30529 1726882590.20157: Calling groups_inventory to load vars for managed_node1 30529 1726882590.20161: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.20174: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.20177: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.20181: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.20570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.20944: done with get_vars() 30529 1726882590.20956: done getting variables 30529 1726882590.21032: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:30 -0400 (0:00:00.541) 0:00:04.236 ****** 30529 1726882590.21065: entering _queue_task() for managed_node1/set_fact 30529 1726882590.21356: worker is 1 (out of 1 available) 30529 1726882590.21367: exiting _queue_task() for managed_node1/set_fact 30529 1726882590.21379: done queuing things up, now waiting for results queue to drain 30529 1726882590.21381: waiting for pending results... 
30529 1726882590.21739: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882590.21744: in run() - task 12673a56-9f93-b0f1-edc0-0000000000f6 30529 1726882590.21764: variable 'ansible_search_path' from source: unknown 30529 1726882590.21770: variable 'ansible_search_path' from source: unknown 30529 1726882590.21811: calling self._execute() 30529 1726882590.21891: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.21903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.21916: variable 'omit' from source: magic vars 30529 1726882590.22276: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.22298: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.22309: variable 'omit' from source: magic vars 30529 1726882590.22364: variable 'omit' from source: magic vars 30529 1726882590.22477: variable '_current_interfaces' from source: set_fact 30529 1726882590.22548: variable 'omit' from source: magic vars 30529 1726882590.22602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882590.22643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882590.22665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882590.22689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.22798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.22802: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882590.22804: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.22806: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.22872: Set connection var ansible_shell_executable to /bin/sh 30529 1726882590.22883: Set connection var ansible_pipelining to False 30529 1726882590.22896: Set connection var ansible_shell_type to sh 30529 1726882590.22912: Set connection var ansible_timeout to 10 30529 1726882590.22926: Set connection var ansible_connection to ssh 30529 1726882590.22937: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882590.22960: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.22968: variable 'ansible_connection' from source: unknown 30529 1726882590.22974: variable 'ansible_module_compression' from source: unknown 30529 1726882590.22980: variable 'ansible_shell_type' from source: unknown 30529 1726882590.22988: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.22997: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.23004: variable 'ansible_pipelining' from source: unknown 30529 1726882590.23010: variable 'ansible_timeout' from source: unknown 30529 1726882590.23016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.23159: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882590.23251: variable 'omit' from source: magic vars 30529 1726882590.23254: starting attempt loop 30529 1726882590.23256: running the handler 30529 1726882590.23258: handler run complete 30529 1726882590.23260: attempt loop complete, returning result 30529 1726882590.23262: _execute() done 30529 1726882590.23264: dumping result to json 30529 1726882590.23266: done dumping result, returning 30529 
1726882590.23268: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-0000000000f6] 30529 1726882590.23270: sending task result for task 12673a56-9f93-b0f1-edc0-0000000000f6 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30529 1726882590.23411: no more pending results, returning what we have 30529 1726882590.23414: results queue empty 30529 1726882590.23415: checking for any_errors_fatal 30529 1726882590.23423: done checking for any_errors_fatal 30529 1726882590.23423: checking for max_fail_percentage 30529 1726882590.23425: done checking for max_fail_percentage 30529 1726882590.23426: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.23427: done checking to see if all hosts have failed 30529 1726882590.23427: getting the remaining hosts for this loop 30529 1726882590.23429: done getting the remaining hosts for this loop 30529 1726882590.23433: getting the next task for host managed_node1 30529 1726882590.23441: done getting next task for host managed_node1 30529 1726882590.23444: ^ task is: TASK: Show current_interfaces 30529 1726882590.23447: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882590.23453: getting variables 30529 1726882590.23454: in VariableManager get_vars() 30529 1726882590.23518: Calling all_inventory to load vars for managed_node1 30529 1726882590.23521: Calling groups_inventory to load vars for managed_node1 30529 1726882590.23525: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.23536: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.23538: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.23541: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.23904: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000000f6 30529 1726882590.23908: WORKER PROCESS EXITING 30529 1726882590.23928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.24136: done with get_vars() 30529 1726882590.24145: done getting variables 30529 1726882590.24200: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:30 -0400 (0:00:00.031) 0:00:04.268 ****** 30529 1726882590.24232: entering _queue_task() for managed_node1/debug 30529 1726882590.24565: worker is 1 (out of 1 available) 30529 1726882590.24574: exiting _queue_task() for managed_node1/debug 30529 1726882590.24585: done queuing things up, now waiting for results queue to drain 30529 1726882590.24589: waiting for pending 
results... 30529 1726882590.24721: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 30529 1726882590.24827: in run() - task 12673a56-9f93-b0f1-edc0-0000000000bb 30529 1726882590.24845: variable 'ansible_search_path' from source: unknown 30529 1726882590.24853: variable 'ansible_search_path' from source: unknown 30529 1726882590.24895: calling self._execute() 30529 1726882590.24967: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.24978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.25000: variable 'omit' from source: magic vars 30529 1726882590.25346: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.25366: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.25376: variable 'omit' from source: magic vars 30529 1726882590.25430: variable 'omit' from source: magic vars 30529 1726882590.25534: variable 'current_interfaces' from source: set_fact 30529 1726882590.25564: variable 'omit' from source: magic vars 30529 1726882590.25615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882590.25659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882590.25688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882590.25714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.25731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.25792: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882590.25797: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.25799: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.25897: Set connection var ansible_shell_executable to /bin/sh 30529 1726882590.25912: Set connection var ansible_pipelining to False 30529 1726882590.25962: Set connection var ansible_shell_type to sh 30529 1726882590.25965: Set connection var ansible_timeout to 10 30529 1726882590.25967: Set connection var ansible_connection to ssh 30529 1726882590.25969: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882590.25971: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.25978: variable 'ansible_connection' from source: unknown 30529 1726882590.25984: variable 'ansible_module_compression' from source: unknown 30529 1726882590.25998: variable 'ansible_shell_type' from source: unknown 30529 1726882590.26008: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.26019: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.26026: variable 'ansible_pipelining' from source: unknown 30529 1726882590.26071: variable 'ansible_timeout' from source: unknown 30529 1726882590.26074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.26179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882590.26199: variable 'omit' from source: magic vars 30529 1726882590.26208: starting attempt loop 30529 1726882590.26215: running the handler 30529 1726882590.26268: handler run complete 30529 1726882590.26338: attempt loop complete, returning result 30529 1726882590.26341: _execute() done 30529 1726882590.26344: dumping result to json 30529 1726882590.26345: done dumping result, returning 30529 1726882590.26348: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-0000000000bb] 30529 1726882590.26350: sending task result for task 12673a56-9f93-b0f1-edc0-0000000000bb ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30529 1726882590.26651: no more pending results, returning what we have 30529 1726882590.26654: results queue empty 30529 1726882590.26655: checking for any_errors_fatal 30529 1726882590.26658: done checking for any_errors_fatal 30529 1726882590.26659: checking for max_fail_percentage 30529 1726882590.26660: done checking for max_fail_percentage 30529 1726882590.26661: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.26662: done checking to see if all hosts have failed 30529 1726882590.26662: getting the remaining hosts for this loop 30529 1726882590.26664: done getting the remaining hosts for this loop 30529 1726882590.26667: getting the next task for host managed_node1 30529 1726882590.26673: done getting next task for host managed_node1 30529 1726882590.26676: ^ task is: TASK: Setup 30529 1726882590.26679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882590.26682: getting variables 30529 1726882590.26683: in VariableManager get_vars() 30529 1726882590.26711: Calling all_inventory to load vars for managed_node1 30529 1726882590.26714: Calling groups_inventory to load vars for managed_node1 30529 1726882590.26717: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.26726: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.26729: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.26732: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.26929: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000000bb 30529 1726882590.26933: WORKER PROCESS EXITING 30529 1726882590.26952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.27159: done with get_vars() 30529 1726882590.27167: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:36:30 -0400 (0:00:00.030) 0:00:04.298 ****** 30529 1726882590.27256: entering _queue_task() for managed_node1/include_tasks 30529 1726882590.27615: worker is 1 (out of 1 available) 30529 1726882590.27623: exiting _queue_task() for managed_node1/include_tasks 30529 1726882590.27632: done queuing things up, now waiting for results queue to drain 30529 1726882590.27633: waiting for pending results... 
30529 1726882590.27723: running TaskExecutor() for managed_node1/TASK: Setup 30529 1726882590.27817: in run() - task 12673a56-9f93-b0f1-edc0-000000000094 30529 1726882590.27836: variable 'ansible_search_path' from source: unknown 30529 1726882590.27844: variable 'ansible_search_path' from source: unknown 30529 1726882590.27895: variable 'lsr_setup' from source: include params 30529 1726882590.28095: variable 'lsr_setup' from source: include params 30529 1726882590.28160: variable 'omit' from source: magic vars 30529 1726882590.28271: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.28284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.28307: variable 'omit' from source: magic vars 30529 1726882590.28585: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.28591: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.28595: variable 'item' from source: unknown 30529 1726882590.28631: variable 'item' from source: unknown 30529 1726882590.28667: variable 'item' from source: unknown 30529 1726882590.28741: variable 'item' from source: unknown 30529 1726882590.29053: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.29057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.29059: variable 'omit' from source: magic vars 30529 1726882590.29075: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.29089: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.29103: variable 'item' from source: unknown 30529 1726882590.29176: variable 'item' from source: unknown 30529 1726882590.29215: variable 'item' from source: unknown 30529 1726882590.29292: variable 'item' from source: unknown 30529 1726882590.29497: dumping result to json 30529 1726882590.29501: done dumping result, returning 30529 
1726882590.29504: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-000000000094] 30529 1726882590.29506: sending task result for task 12673a56-9f93-b0f1-edc0-000000000094 30529 1726882590.29544: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000094 30529 1726882590.29547: WORKER PROCESS EXITING 30529 1726882590.29573: no more pending results, returning what we have 30529 1726882590.29578: in VariableManager get_vars() 30529 1726882590.29617: Calling all_inventory to load vars for managed_node1 30529 1726882590.29620: Calling groups_inventory to load vars for managed_node1 30529 1726882590.29624: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.29636: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.29640: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.29643: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.29953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.30149: done with get_vars() 30529 1726882590.30157: variable 'ansible_search_path' from source: unknown 30529 1726882590.30158: variable 'ansible_search_path' from source: unknown 30529 1726882590.30202: variable 'ansible_search_path' from source: unknown 30529 1726882590.30203: variable 'ansible_search_path' from source: unknown 30529 1726882590.30237: we have included files to process 30529 1726882590.30238: generating all_blocks data 30529 1726882590.30240: done generating all_blocks data 30529 1726882590.30243: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882590.30244: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882590.30247: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882590.30469: done processing included file 30529 1726882590.30471: iterating over new_blocks loaded from include file 30529 1726882590.30472: in VariableManager get_vars() 30529 1726882590.30488: done with get_vars() 30529 1726882590.30490: filtering new block on tags 30529 1726882590.30515: done filtering new block on tags 30529 1726882590.30517: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 => (item=tasks/delete_interface.yml) 30529 1726882590.30522: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882590.30523: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882590.30525: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882590.30644: in VariableManager get_vars() 30529 1726882590.30666: done with get_vars() 30529 1726882590.30788: done processing included file 30529 1726882590.30790: iterating over new_blocks loaded from include file 30529 1726882590.30791: in VariableManager get_vars() 30529 1726882590.30806: done with get_vars() 30529 1726882590.30808: filtering new block on tags 30529 1726882590.30869: done filtering new block on tags 30529 1726882590.30871: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 => (item=tasks/assert_device_absent.yml) 30529 1726882590.30880: extending task lists for all hosts with 
included blocks 30529 1726882590.31512: done extending task lists 30529 1726882590.31513: done processing included files 30529 1726882590.31514: results queue empty 30529 1726882590.31515: checking for any_errors_fatal 30529 1726882590.31518: done checking for any_errors_fatal 30529 1726882590.31519: checking for max_fail_percentage 30529 1726882590.31520: done checking for max_fail_percentage 30529 1726882590.31521: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.31521: done checking to see if all hosts have failed 30529 1726882590.31522: getting the remaining hosts for this loop 30529 1726882590.31523: done getting the remaining hosts for this loop 30529 1726882590.31526: getting the next task for host managed_node1 30529 1726882590.31535: done getting next task for host managed_node1 30529 1726882590.31537: ^ task is: TASK: Remove test interface if necessary 30529 1726882590.31540: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882590.31543: getting variables 30529 1726882590.31544: in VariableManager get_vars() 30529 1726882590.31556: Calling all_inventory to load vars for managed_node1 30529 1726882590.31558: Calling groups_inventory to load vars for managed_node1 30529 1726882590.31560: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.31565: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.31567: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.31570: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.31710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.31905: done with get_vars() 30529 1726882590.31912: done getting variables 30529 1726882590.31945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:30 -0400 (0:00:00.047) 0:00:04.345 ****** 30529 1726882590.31976: entering _queue_task() for managed_node1/command 30529 1726882590.32314: worker is 1 (out of 1 available) 30529 1726882590.32324: exiting _queue_task() for managed_node1/command 30529 1726882590.32336: done queuing things up, now waiting for results queue to drain 30529 1726882590.32337: waiting for pending results... 
30529 1726882590.32505: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 30529 1726882590.32559: in run() - task 12673a56-9f93-b0f1-edc0-00000000011b 30529 1726882590.32571: variable 'ansible_search_path' from source: unknown 30529 1726882590.32574: variable 'ansible_search_path' from source: unknown 30529 1726882590.32604: calling self._execute() 30529 1726882590.32660: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.32664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.32674: variable 'omit' from source: magic vars 30529 1726882590.32938: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.32947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.32953: variable 'omit' from source: magic vars 30529 1726882590.32991: variable 'omit' from source: magic vars 30529 1726882590.33056: variable 'interface' from source: play vars 30529 1726882590.33081: variable 'omit' from source: magic vars 30529 1726882590.33116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882590.33144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882590.33158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882590.33172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.33181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.33206: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882590.33209: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.33212: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.33283: Set connection var ansible_shell_executable to /bin/sh 30529 1726882590.33288: Set connection var ansible_pipelining to False 30529 1726882590.33291: Set connection var ansible_shell_type to sh 30529 1726882590.33298: Set connection var ansible_timeout to 10 30529 1726882590.33301: Set connection var ansible_connection to ssh 30529 1726882590.33306: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882590.33321: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.33324: variable 'ansible_connection' from source: unknown 30529 1726882590.33328: variable 'ansible_module_compression' from source: unknown 30529 1726882590.33330: variable 'ansible_shell_type' from source: unknown 30529 1726882590.33333: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.33336: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.33338: variable 'ansible_pipelining' from source: unknown 30529 1726882590.33341: variable 'ansible_timeout' from source: unknown 30529 1726882590.33349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.33438: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882590.33446: variable 'omit' from source: magic vars 30529 1726882590.33456: starting attempt loop 30529 1726882590.33459: running the handler 30529 1726882590.33467: _low_level_execute_command(): starting 30529 1726882590.33474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882590.33963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.33967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882590.33971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.34029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.34032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.34038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.34084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.35731: stdout chunk (state=3): >>>/root <<< 30529 1726882590.35823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.35854: stderr chunk (state=3): >>><<< 30529 1726882590.35856: stdout chunk (state=3): >>><<< 30529 1726882590.35872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.35901: _low_level_execute_command(): starting 30529 1726882590.35904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411 `" && echo ansible-tmp-1726882590.3587604-30705-17378576829411="` echo /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411 `" ) && sleep 0' 30529 1726882590.36265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.36308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882590.36312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882590.36319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882590.36322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.36325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.36357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.36360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.36418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.38316: stdout chunk (state=3): >>>ansible-tmp-1726882590.3587604-30705-17378576829411=/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411 <<< 30529 1726882590.38442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.38446: stdout chunk (state=3): >>><<< 30529 1726882590.38448: stderr chunk (state=3): >>><<< 30529 1726882590.38502: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882590.3587604-30705-17378576829411=/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.38600: variable 'ansible_module_compression' from source: unknown 30529 1726882590.38603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882590.38631: variable 'ansible_facts' from source: unknown 30529 1726882590.38721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py 30529 1726882590.38914: Sending initial data 30529 1726882590.38924: Sent initial data (155 bytes) 30529 1726882590.39444: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882590.39452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882590.39467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.39482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882590.39498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882590.39646: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.39650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.39652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.39654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.39697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.41324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882590.41366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882590.41454: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpn2g9c27o /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py <<< 30529 1726882590.41462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py" <<< 30529 1726882590.41517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpn2g9c27o" to remote "/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py" <<< 30529 1726882590.42752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.42756: stderr chunk (state=3): >>><<< 30529 1726882590.42758: stdout chunk (state=3): >>><<< 30529 1726882590.43012: done transferring module to remote 30529 1726882590.43015: _low_level_execute_command(): starting 30529 1726882590.43020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/ /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py && sleep 0' 30529 1726882590.43647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882590.43653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882590.43669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882590.43702: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.43706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882590.43712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.43714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.43780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.43784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.43823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.43860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.45856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.45859: stdout chunk (state=3): >>><<< 30529 1726882590.45861: stderr chunk (state=3): >>><<< 30529 1726882590.45864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.45866: _low_level_execute_command(): starting 30529 1726882590.45868: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/AnsiballZ_command.py && sleep 0' 30529 1726882590.46664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882590.46762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.46769: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.46780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.46849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.62644: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:36:30.618229", "end": "2024-09-20 21:36:30.625404", "delta": "0:00:00.007175", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882590.64004: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882590.64034: stderr chunk (state=3): >>><<< 30529 1726882590.64037: stdout chunk (state=3): >>><<< 30529 1726882590.64057: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:36:30.618229", "end": "2024-09-20 21:36:30.625404", "delta": "0:00:00.007175", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
30529 1726882590.64090: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882590.64096: _low_level_execute_command(): starting 30529 1726882590.64103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882590.3587604-30705-17378576829411/ > /dev/null 2>&1 && sleep 0' 30529 1726882590.64551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882590.64554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.64562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882590.64564: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.64568: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.64619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.64626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.64628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.64667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.66451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.66478: stderr chunk (state=3): >>><<< 30529 1726882590.66481: stdout chunk (state=3): >>><<< 30529 1726882590.66501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.66504: handler run complete 30529 1726882590.66524: Evaluated conditional (False): False 30529 1726882590.66532: attempt loop complete, returning result 30529 1726882590.66535: _execute() done 30529 1726882590.66537: dumping result to json 30529 1726882590.66542: done dumping result, returning 30529 1726882590.66550: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [12673a56-9f93-b0f1-edc0-00000000011b] 30529 1726882590.66554: sending task result for task 12673a56-9f93-b0f1-edc0-00000000011b 30529 1726882590.66647: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000011b 30529 1726882590.66649: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007175", "end": "2024-09-20 21:36:30.625404", "rc": 1, "start": "2024-09-20 21:36:30.618229" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30529 1726882590.66717: no more pending results, returning what we have 30529 1726882590.66721: results queue empty 30529 1726882590.66722: checking for any_errors_fatal 30529 1726882590.66724: done checking for any_errors_fatal 30529 1726882590.66724: checking for max_fail_percentage 30529 1726882590.66726: done checking for max_fail_percentage 30529 1726882590.66726: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.66727: done checking to see if all hosts have failed 30529 1726882590.66728: getting the remaining hosts for this loop 30529 1726882590.66729: done getting the remaining hosts for this loop 30529 1726882590.66733: getting the next task for host managed_node1 30529 1726882590.66742: done getting next task for host managed_node1 30529 1726882590.66744: ^ task is: TASK: Include the task 'get_interface_stat.yml' 
30529 1726882590.66748: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882590.66751: getting variables 30529 1726882590.66753: in VariableManager get_vars() 30529 1726882590.66817: Calling all_inventory to load vars for managed_node1 30529 1726882590.66820: Calling groups_inventory to load vars for managed_node1 30529 1726882590.66824: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.66833: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.66835: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.66837: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.66960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.67077: done with get_vars() 30529 1726882590.67085: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:30 -0400 (0:00:00.351) 0:00:04.697 ****** 30529 
1726882590.67154: entering _queue_task() for managed_node1/include_tasks 30529 1726882590.67348: worker is 1 (out of 1 available) 30529 1726882590.67361: exiting _queue_task() for managed_node1/include_tasks 30529 1726882590.67374: done queuing things up, now waiting for results queue to drain 30529 1726882590.67375: waiting for pending results... 30529 1726882590.67516: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882590.67574: in run() - task 12673a56-9f93-b0f1-edc0-00000000011f 30529 1726882590.67585: variable 'ansible_search_path' from source: unknown 30529 1726882590.67591: variable 'ansible_search_path' from source: unknown 30529 1726882590.67620: calling self._execute() 30529 1726882590.67672: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.67676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.67684: variable 'omit' from source: magic vars 30529 1726882590.67935: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.67945: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.67950: _execute() done 30529 1726882590.67953: dumping result to json 30529 1726882590.67957: done dumping result, returning 30529 1726882590.67964: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-00000000011f] 30529 1726882590.67968: sending task result for task 12673a56-9f93-b0f1-edc0-00000000011f 30529 1726882590.68051: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000011f 30529 1726882590.68054: WORKER PROCESS EXITING 30529 1726882590.68078: no more pending results, returning what we have 30529 1726882590.68082: in VariableManager get_vars() 30529 1726882590.68115: Calling all_inventory to load vars for managed_node1 30529 1726882590.68118: Calling groups_inventory to load vars for managed_node1 
30529 1726882590.68120: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.68129: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.68131: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.68133: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.68258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.68384: done with get_vars() 30529 1726882590.68392: variable 'ansible_search_path' from source: unknown 30529 1726882590.68395: variable 'ansible_search_path' from source: unknown 30529 1726882590.68401: variable 'item' from source: include params 30529 1726882590.68477: variable 'item' from source: include params 30529 1726882590.68507: we have included files to process 30529 1726882590.68508: generating all_blocks data 30529 1726882590.68509: done generating all_blocks data 30529 1726882590.68512: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882590.68513: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882590.68514: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882590.68662: done processing included file 30529 1726882590.68663: iterating over new_blocks loaded from include file 30529 1726882590.68664: in VariableManager get_vars() 30529 1726882590.68673: done with get_vars() 30529 1726882590.68674: filtering new block on tags 30529 1726882590.68692: done filtering new block on tags 30529 1726882590.68695: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882590.68698: extending task lists for all hosts with included blocks 30529 1726882590.68788: done extending task lists 30529 1726882590.68789: done processing included files 30529 1726882590.68789: results queue empty 30529 1726882590.68789: checking for any_errors_fatal 30529 1726882590.68792: done checking for any_errors_fatal 30529 1726882590.68792: checking for max_fail_percentage 30529 1726882590.68794: done checking for max_fail_percentage 30529 1726882590.68795: checking to see if all hosts have failed and the running result is not ok 30529 1726882590.68796: done checking to see if all hosts have failed 30529 1726882590.68796: getting the remaining hosts for this loop 30529 1726882590.68797: done getting the remaining hosts for this loop 30529 1726882590.68799: getting the next task for host managed_node1 30529 1726882590.68801: done getting next task for host managed_node1 30529 1726882590.68803: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882590.68805: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882590.68806: getting variables 30529 1726882590.68807: in VariableManager get_vars() 30529 1726882590.68813: Calling all_inventory to load vars for managed_node1 30529 1726882590.68814: Calling groups_inventory to load vars for managed_node1 30529 1726882590.68816: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882590.68819: Calling all_plugins_play to load vars for managed_node1 30529 1726882590.68821: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882590.68822: Calling groups_plugins_play to load vars for managed_node1 30529 1726882590.68905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882590.69014: done with get_vars() 30529 1726882590.69020: done getting variables 30529 1726882590.69103: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:30 -0400 (0:00:00.019) 0:00:04.717 ****** 30529 1726882590.69122: entering _queue_task() for managed_node1/stat 30529 1726882590.69299: worker is 1 (out of 1 available) 30529 1726882590.69312: exiting _queue_task() for managed_node1/stat 30529 1726882590.69324: done queuing things up, now waiting for results queue to drain 30529 1726882590.69325: waiting for pending results... 
30529 1726882590.69463: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882590.69527: in run() - task 12673a56-9f93-b0f1-edc0-00000000016e 30529 1726882590.69538: variable 'ansible_search_path' from source: unknown 30529 1726882590.69541: variable 'ansible_search_path' from source: unknown 30529 1726882590.69572: calling self._execute() 30529 1726882590.69624: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.69627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.69636: variable 'omit' from source: magic vars 30529 1726882590.69928: variable 'ansible_distribution_major_version' from source: facts 30529 1726882590.69937: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882590.69943: variable 'omit' from source: magic vars 30529 1726882590.69976: variable 'omit' from source: magic vars 30529 1726882590.70045: variable 'interface' from source: play vars 30529 1726882590.70058: variable 'omit' from source: magic vars 30529 1726882590.70086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882590.70120: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882590.70143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882590.70156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.70166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882590.70188: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882590.70196: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.70199: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.70268: Set connection var ansible_shell_executable to /bin/sh 30529 1726882590.70271: Set connection var ansible_pipelining to False 30529 1726882590.70274: Set connection var ansible_shell_type to sh 30529 1726882590.70282: Set connection var ansible_timeout to 10 30529 1726882590.70284: Set connection var ansible_connection to ssh 30529 1726882590.70292: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882590.70310: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.70314: variable 'ansible_connection' from source: unknown 30529 1726882590.70317: variable 'ansible_module_compression' from source: unknown 30529 1726882590.70319: variable 'ansible_shell_type' from source: unknown 30529 1726882590.70321: variable 'ansible_shell_executable' from source: unknown 30529 1726882590.70323: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882590.70326: variable 'ansible_pipelining' from source: unknown 30529 1726882590.70328: variable 'ansible_timeout' from source: unknown 30529 1726882590.70330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882590.70466: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882590.70481: variable 'omit' from source: magic vars 30529 1726882590.70484: starting attempt loop 30529 1726882590.70486: running the handler 30529 1726882590.70502: _low_level_execute_command(): starting 30529 1726882590.70509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882590.71002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.71008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.71011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.71013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.71062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.71065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.71071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.71116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.72678: stdout chunk (state=3): >>>/root <<< 30529 1726882590.72778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.72804: stderr chunk (state=3): >>><<< 30529 1726882590.72808: stdout chunk (state=3): >>><<< 30529 1726882590.72826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.72868: _low_level_execute_command(): starting 30529 1726882590.72872: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968 `" && echo ansible-tmp-1726882590.7282364-30727-55401708987968="` echo /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968 `" ) && sleep 0' 30529 1726882590.73258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.73262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.73283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.73333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.73336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.73382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.75236: stdout chunk (state=3): >>>ansible-tmp-1726882590.7282364-30727-55401708987968=/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968 <<< 30529 1726882590.75339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.75359: stderr chunk (state=3): >>><<< 30529 1726882590.75362: stdout chunk (state=3): >>><<< 30529 1726882590.75376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882590.7282364-30727-55401708987968=/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.75413: variable 'ansible_module_compression' from source: unknown 30529 1726882590.75451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882590.75482: variable 'ansible_facts' from source: unknown 30529 1726882590.75544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py 30529 1726882590.75637: Sending initial data 30529 1726882590.75640: Sent initial data (152 bytes) 30529 1726882590.76064: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.76067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.76069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882590.76072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882590.76074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.76121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.76124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.76169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.77815: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882590.78199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882590.78202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpf5t_jnli /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py <<< 30529 1726882590.78205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py" <<< 30529 1726882590.78208: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpf5t_jnli" to remote "/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py" <<< 30529 1726882590.79425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.79599: stderr chunk (state=3): >>><<< 30529 1726882590.79603: stdout chunk (state=3): >>><<< 30529 1726882590.79605: done transferring module to remote 30529 1726882590.79607: _low_level_execute_command(): starting 30529 1726882590.79610: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/ /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py && sleep 0' 30529 1726882590.80708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.80840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.80844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.80896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.82600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882590.82634: stderr chunk (state=3): >>><<< 30529 1726882590.82638: stdout chunk (state=3): >>><<< 30529 1726882590.82653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882590.82656: _low_level_execute_command(): starting 30529 1726882590.82659: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/AnsiballZ_stat.py && sleep 0' 30529 1726882590.83341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882590.83350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882590.83372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882590.83389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882590.83401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882590.83410: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882590.83418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.83484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882590.83522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882590.83531: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882590.83548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882590.83620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882590.98501: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882590.99662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882590.99689: stderr chunk (state=3): >>><<< 30529 1726882590.99701: stdout chunk (state=3): >>><<< 30529 1726882590.99714: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882590.99738: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882590.99747: _low_level_execute_command(): starting 30529 1726882590.99753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882590.7282364-30727-55401708987968/ > /dev/null 2>&1 && sleep 0' 30529 1726882591.00214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882591.00217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882591.00219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882591.00221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882591.00223: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882591.00225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882591.00278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882591.00282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882591.00286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.00328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882591.02126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882591.02151: stderr chunk (state=3): >>><<< 30529 1726882591.02156: stdout chunk (state=3): >>><<< 30529 1726882591.02171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882591.02177: handler run complete 30529 1726882591.02196: attempt loop complete, returning result 30529 1726882591.02199: _execute() done 30529 1726882591.02202: dumping result to json 30529 1726882591.02204: done dumping result, returning 30529 1726882591.02211: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-00000000016e] 30529 1726882591.02215: sending task result for task 12673a56-9f93-b0f1-edc0-00000000016e 30529 1726882591.02307: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000016e 30529 1726882591.02310: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882591.02364: no more pending results, returning what we have 30529 1726882591.02367: results queue empty 30529 1726882591.02368: checking for any_errors_fatal 30529 1726882591.02369: done checking for any_errors_fatal 30529 1726882591.02370: checking for max_fail_percentage 30529 1726882591.02371: done checking for max_fail_percentage 30529 1726882591.02372: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.02373: done checking to see if all hosts have failed 30529 1726882591.02373: getting the remaining hosts for this loop 30529 1726882591.02375: done getting the remaining hosts for this loop 30529 1726882591.02378: getting the next task for host managed_node1 30529 
1726882591.02387: done getting next task for host managed_node1 30529 1726882591.02390: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30529 1726882591.02395: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882591.02399: getting variables 30529 1726882591.02401: in VariableManager get_vars() 30529 1726882591.02436: Calling all_inventory to load vars for managed_node1 30529 1726882591.02439: Calling groups_inventory to load vars for managed_node1 30529 1726882591.02442: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.02452: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.02455: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.02457: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.02626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.02739: done with get_vars() 30529 1726882591.02747: done getting variables 30529 1726882591.02821: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 30529 1726882591.02908: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:31 -0400 (0:00:00.338) 0:00:05.055 ****** 30529 1726882591.02930: entering _queue_task() for managed_node1/assert 30529 1726882591.02931: Creating lock for assert 30529 1726882591.03124: worker is 1 (out of 1 available) 30529 1726882591.03138: exiting _queue_task() for managed_node1/assert 30529 1726882591.03151: done queuing things up, now waiting for results queue to drain 30529 1726882591.03153: waiting for pending results... 
30529 1726882591.03302: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' 30529 1726882591.03363: in run() - task 12673a56-9f93-b0f1-edc0-000000000120 30529 1726882591.03381: variable 'ansible_search_path' from source: unknown 30529 1726882591.03384: variable 'ansible_search_path' from source: unknown 30529 1726882591.03411: calling self._execute() 30529 1726882591.03463: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.03467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.03475: variable 'omit' from source: magic vars 30529 1726882591.03724: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.03734: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.03740: variable 'omit' from source: magic vars 30529 1726882591.03770: variable 'omit' from source: magic vars 30529 1726882591.03841: variable 'interface' from source: play vars 30529 1726882591.03855: variable 'omit' from source: magic vars 30529 1726882591.03885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882591.03915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882591.03999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882591.04002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882591.04005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882591.04007: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882591.04009: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.04012: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.04055: Set connection var ansible_shell_executable to /bin/sh 30529 1726882591.04059: Set connection var ansible_pipelining to False 30529 1726882591.04061: Set connection var ansible_shell_type to sh 30529 1726882591.04069: Set connection var ansible_timeout to 10 30529 1726882591.04072: Set connection var ansible_connection to ssh 30529 1726882591.04076: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882591.04096: variable 'ansible_shell_executable' from source: unknown 30529 1726882591.04100: variable 'ansible_connection' from source: unknown 30529 1726882591.04102: variable 'ansible_module_compression' from source: unknown 30529 1726882591.04104: variable 'ansible_shell_type' from source: unknown 30529 1726882591.04106: variable 'ansible_shell_executable' from source: unknown 30529 1726882591.04108: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.04111: variable 'ansible_pipelining' from source: unknown 30529 1726882591.04113: variable 'ansible_timeout' from source: unknown 30529 1726882591.04118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.04217: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882591.04226: variable 'omit' from source: magic vars 30529 1726882591.04231: starting attempt loop 30529 1726882591.04234: running the handler 30529 1726882591.04330: variable 'interface_stat' from source: set_fact 30529 1726882591.04337: Evaluated conditional (not interface_stat.stat.exists): True 30529 1726882591.04342: handler run complete 30529 1726882591.04354: attempt loop complete, returning result 
30529 1726882591.04356: _execute() done 30529 1726882591.04361: dumping result to json 30529 1726882591.04363: done dumping result, returning 30529 1726882591.04366: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' [12673a56-9f93-b0f1-edc0-000000000120] 30529 1726882591.04376: sending task result for task 12673a56-9f93-b0f1-edc0-000000000120 30529 1726882591.04449: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000120 30529 1726882591.04452: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882591.04524: no more pending results, returning what we have 30529 1726882591.04527: results queue empty 30529 1726882591.04528: checking for any_errors_fatal 30529 1726882591.04533: done checking for any_errors_fatal 30529 1726882591.04534: checking for max_fail_percentage 30529 1726882591.04535: done checking for max_fail_percentage 30529 1726882591.04536: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.04537: done checking to see if all hosts have failed 30529 1726882591.04537: getting the remaining hosts for this loop 30529 1726882591.04539: done getting the remaining hosts for this loop 30529 1726882591.04542: getting the next task for host managed_node1 30529 1726882591.04548: done getting next task for host managed_node1 30529 1726882591.04550: ^ task is: TASK: Test 30529 1726882591.04552: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30529 1726882591.04555: getting variables 30529 1726882591.04557: in VariableManager get_vars() 30529 1726882591.04578: Calling all_inventory to load vars for managed_node1 30529 1726882591.04580: Calling groups_inventory to load vars for managed_node1 30529 1726882591.04583: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.04595: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.04598: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.04601: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.04702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.04835: done with get_vars() 30529 1726882591.04842: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:36:31 -0400 (0:00:00.019) 0:00:05.075 ****** 30529 1726882591.04902: entering _queue_task() for managed_node1/include_tasks 30529 1726882591.05075: worker is 1 (out of 1 available) 30529 1726882591.05088: exiting _queue_task() for managed_node1/include_tasks 30529 1726882591.05104: done queuing things up, now waiting for results queue to drain 30529 1726882591.05105: waiting for pending results... 
30529 1726882591.05242: running TaskExecutor() for managed_node1/TASK: Test 30529 1726882591.05302: in run() - task 12673a56-9f93-b0f1-edc0-000000000095 30529 1726882591.05314: variable 'ansible_search_path' from source: unknown 30529 1726882591.05317: variable 'ansible_search_path' from source: unknown 30529 1726882591.05357: variable 'lsr_test' from source: include params 30529 1726882591.05497: variable 'lsr_test' from source: include params 30529 1726882591.05547: variable 'omit' from source: magic vars 30529 1726882591.05625: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.05631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.05639: variable 'omit' from source: magic vars 30529 1726882591.05792: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.05799: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.05805: variable 'item' from source: unknown 30529 1726882591.05848: variable 'item' from source: unknown 30529 1726882591.05868: variable 'item' from source: unknown 30529 1726882591.05915: variable 'item' from source: unknown 30529 1726882591.06026: dumping result to json 30529 1726882591.06029: done dumping result, returning 30529 1726882591.06031: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-000000000095] 30529 1726882591.06033: sending task result for task 12673a56-9f93-b0f1-edc0-000000000095 30529 1726882591.06067: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000095 30529 1726882591.06070: WORKER PROCESS EXITING 30529 1726882591.06091: no more pending results, returning what we have 30529 1726882591.06097: in VariableManager get_vars() 30529 1726882591.06122: Calling all_inventory to load vars for managed_node1 30529 1726882591.06124: Calling groups_inventory to load vars for managed_node1 30529 1726882591.06127: Calling all_plugins_inventory to load 
vars for managed_node1 30529 1726882591.06135: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.06138: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.06140: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.06250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.06359: done with get_vars() 30529 1726882591.06364: variable 'ansible_search_path' from source: unknown 30529 1726882591.06365: variable 'ansible_search_path' from source: unknown 30529 1726882591.06394: we have included files to process 30529 1726882591.06395: generating all_blocks data 30529 1726882591.06396: done generating all_blocks data 30529 1726882591.06398: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882591.06399: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882591.06401: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882591.06598: done processing included file 30529 1726882591.06600: iterating over new_blocks loaded from include file 30529 1726882591.06601: in VariableManager get_vars() 30529 1726882591.06610: done with get_vars() 30529 1726882591.06611: filtering new block on tags 30529 1726882591.06631: done filtering new block on tags 30529 1726882591.06632: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node1 => (item=tasks/create_bridge_profile.yml) 30529 1726882591.06635: extending task lists for all hosts with included blocks 30529 1726882591.07285: done 
extending task lists 30529 1726882591.07288: done processing included files 30529 1726882591.07288: results queue empty 30529 1726882591.07289: checking for any_errors_fatal 30529 1726882591.07290: done checking for any_errors_fatal 30529 1726882591.07291: checking for max_fail_percentage 30529 1726882591.07291: done checking for max_fail_percentage 30529 1726882591.07292: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.07294: done checking to see if all hosts have failed 30529 1726882591.07294: getting the remaining hosts for this loop 30529 1726882591.07295: done getting the remaining hosts for this loop 30529 1726882591.07297: getting the next task for host managed_node1 30529 1726882591.07300: done getting next task for host managed_node1 30529 1726882591.07301: ^ task is: TASK: Include network role 30529 1726882591.07303: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882591.07304: getting variables 30529 1726882591.07305: in VariableManager get_vars() 30529 1726882591.07310: Calling all_inventory to load vars for managed_node1 30529 1726882591.07312: Calling groups_inventory to load vars for managed_node1 30529 1726882591.07313: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.07316: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.07317: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.07319: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.07400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.07506: done with get_vars() 30529 1726882591.07512: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:36:31 -0400 (0:00:00.026) 0:00:05.101 ****** 30529 1726882591.07554: entering _queue_task() for managed_node1/include_role 30529 1726882591.07555: Creating lock for include_role 30529 1726882591.07735: worker is 1 (out of 1 available) 30529 1726882591.07747: exiting _queue_task() for managed_node1/include_role 30529 1726882591.07759: done queuing things up, now waiting for results queue to drain 30529 1726882591.07761: waiting for pending results... 
30529 1726882591.07899: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882591.07963: in run() - task 12673a56-9f93-b0f1-edc0-00000000018e 30529 1726882591.07972: variable 'ansible_search_path' from source: unknown 30529 1726882591.07976: variable 'ansible_search_path' from source: unknown 30529 1726882591.08005: calling self._execute() 30529 1726882591.08057: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.08061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.08069: variable 'omit' from source: magic vars 30529 1726882591.08327: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.08338: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.08341: _execute() done 30529 1726882591.08344: dumping result to json 30529 1726882591.08353: done dumping result, returning 30529 1726882591.08356: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-00000000018e] 30529 1726882591.08359: sending task result for task 12673a56-9f93-b0f1-edc0-00000000018e 30529 1726882591.08456: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000018e 30529 1726882591.08459: WORKER PROCESS EXITING 30529 1726882591.08483: no more pending results, returning what we have 30529 1726882591.08489: in VariableManager get_vars() 30529 1726882591.08514: Calling all_inventory to load vars for managed_node1 30529 1726882591.08516: Calling groups_inventory to load vars for managed_node1 30529 1726882591.08519: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.08527: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.08529: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.08531: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.08665: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.08780: done with get_vars() 30529 1726882591.08785: variable 'ansible_search_path' from source: unknown 30529 1726882591.08787: variable 'ansible_search_path' from source: unknown 30529 1726882591.08894: variable 'omit' from source: magic vars 30529 1726882591.08921: variable 'omit' from source: magic vars 30529 1726882591.08930: variable 'omit' from source: magic vars 30529 1726882591.08932: we have included files to process 30529 1726882591.08932: generating all_blocks data 30529 1726882591.08933: done generating all_blocks data 30529 1726882591.08934: processing included file: fedora.linux_system_roles.network 30529 1726882591.08947: in VariableManager get_vars() 30529 1726882591.08954: done with get_vars() 30529 1726882591.08998: in VariableManager get_vars() 30529 1726882591.09008: done with get_vars() 30529 1726882591.09040: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882591.09195: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882591.09276: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882591.09683: in VariableManager get_vars() 30529 1726882591.09699: done with get_vars() 30529 1726882591.09971: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882591.11174: iterating over new_blocks loaded from include file 30529 1726882591.11175: in VariableManager get_vars() 30529 1726882591.11195: done with get_vars() 30529 1726882591.11197: filtering new block on tags 30529 1726882591.11464: done filtering new block on tags 30529 1726882591.11468: in VariableManager get_vars() 30529 1726882591.11480: done with 
get_vars() 30529 1726882591.11482: filtering new block on tags 30529 1726882591.11502: done filtering new block on tags 30529 1726882591.11504: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882591.11508: extending task lists for all hosts with included blocks 30529 1726882591.11657: done extending task lists 30529 1726882591.11658: done processing included files 30529 1726882591.11659: results queue empty 30529 1726882591.11660: checking for any_errors_fatal 30529 1726882591.11662: done checking for any_errors_fatal 30529 1726882591.11663: checking for max_fail_percentage 30529 1726882591.11664: done checking for max_fail_percentage 30529 1726882591.11665: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.11666: done checking to see if all hosts have failed 30529 1726882591.11666: getting the remaining hosts for this loop 30529 1726882591.11668: done getting the remaining hosts for this loop 30529 1726882591.11670: getting the next task for host managed_node1 30529 1726882591.11674: done getting next task for host managed_node1 30529 1726882591.11676: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882591.11679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882591.11690: getting variables 30529 1726882591.11691: in VariableManager get_vars() 30529 1726882591.11704: Calling all_inventory to load vars for managed_node1 30529 1726882591.11706: Calling groups_inventory to load vars for managed_node1 30529 1726882591.11708: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.11713: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.11715: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.11717: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.11864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.12043: done with get_vars() 30529 1726882591.12054: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:31 -0400 (0:00:00.045) 0:00:05.147 ****** 30529 1726882591.12112: entering _queue_task() for managed_node1/include_tasks 30529 1726882591.12302: worker is 1 (out of 1 available) 30529 1726882591.12315: exiting _queue_task() for managed_node1/include_tasks 30529 1726882591.12327: done queuing things up, now waiting for results queue to drain 30529 1726882591.12329: 
waiting for pending results... 30529 1726882591.12482: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882591.12555: in run() - task 12673a56-9f93-b0f1-edc0-00000000020c 30529 1726882591.12566: variable 'ansible_search_path' from source: unknown 30529 1726882591.12570: variable 'ansible_search_path' from source: unknown 30529 1726882591.12598: calling self._execute() 30529 1726882591.12659: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.12662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.12669: variable 'omit' from source: magic vars 30529 1726882591.13098: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.13102: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.13104: _execute() done 30529 1726882591.13107: dumping result to json 30529 1726882591.13109: done dumping result, returning 30529 1726882591.13111: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-00000000020c] 30529 1726882591.13113: sending task result for task 12673a56-9f93-b0f1-edc0-00000000020c 30529 1726882591.13172: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000020c 30529 1726882591.13176: WORKER PROCESS EXITING 30529 1726882591.13271: no more pending results, returning what we have 30529 1726882591.13275: in VariableManager get_vars() 30529 1726882591.13309: Calling all_inventory to load vars for managed_node1 30529 1726882591.13312: Calling groups_inventory to load vars for managed_node1 30529 1726882591.13314: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.13321: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.13324: Calling groups_plugins_inventory to load vars for managed_node1 30529 
1726882591.13327: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.13484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.13691: done with get_vars() 30529 1726882591.13701: variable 'ansible_search_path' from source: unknown 30529 1726882591.13703: variable 'ansible_search_path' from source: unknown 30529 1726882591.13740: we have included files to process 30529 1726882591.13741: generating all_blocks data 30529 1726882591.13743: done generating all_blocks data 30529 1726882591.13746: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882591.13747: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882591.13749: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882591.14688: done processing included file 30529 1726882591.14690: iterating over new_blocks loaded from include file 30529 1726882591.14691: in VariableManager get_vars() 30529 1726882591.14712: done with get_vars() 30529 1726882591.14713: filtering new block on tags 30529 1726882591.14738: done filtering new block on tags 30529 1726882591.14740: in VariableManager get_vars() 30529 1726882591.14758: done with get_vars() 30529 1726882591.14759: filtering new block on tags 30529 1726882591.14813: done filtering new block on tags 30529 1726882591.14816: in VariableManager get_vars() 30529 1726882591.14837: done with get_vars() 30529 1726882591.14839: filtering new block on tags 30529 1726882591.14878: done filtering new block on tags 30529 1726882591.14880: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882591.14885: 
extending task lists for all hosts with included blocks 30529 1726882591.17196: done extending task lists 30529 1726882591.17198: done processing included files 30529 1726882591.17198: results queue empty 30529 1726882591.17199: checking for any_errors_fatal 30529 1726882591.17202: done checking for any_errors_fatal 30529 1726882591.17203: checking for max_fail_percentage 30529 1726882591.17204: done checking for max_fail_percentage 30529 1726882591.17204: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.17205: done checking to see if all hosts have failed 30529 1726882591.17206: getting the remaining hosts for this loop 30529 1726882591.17207: done getting the remaining hosts for this loop 30529 1726882591.17210: getting the next task for host managed_node1 30529 1726882591.17215: done getting next task for host managed_node1 30529 1726882591.17218: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882591.17222: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882591.17231: getting variables 30529 1726882591.17232: in VariableManager get_vars() 30529 1726882591.17243: Calling all_inventory to load vars for managed_node1 30529 1726882591.17245: Calling groups_inventory to load vars for managed_node1 30529 1726882591.17247: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.17251: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.17254: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.17256: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.17617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.17966: done with get_vars() 30529 1726882591.17974: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:31 -0400 (0:00:00.059) 0:00:05.206 ****** 30529 1726882591.18044: entering _queue_task() for managed_node1/setup 30529 1726882591.19134: worker is 1 (out of 1 available) 30529 1726882591.19145: exiting _queue_task() for managed_node1/setup 30529 1726882591.19159: done queuing things up, now waiting for results queue to drain 30529 1726882591.19160: waiting for pending results... 
30529 1726882591.19440: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882591.19826: in run() - task 12673a56-9f93-b0f1-edc0-000000000269 30529 1726882591.19830: variable 'ansible_search_path' from source: unknown 30529 1726882591.19832: variable 'ansible_search_path' from source: unknown 30529 1726882591.19835: calling self._execute() 30529 1726882591.19972: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.20006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.20051: variable 'omit' from source: magic vars 30529 1726882591.20686: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.20797: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.21249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882591.24156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882591.24239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882591.24278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882591.24320: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882591.24370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882591.24455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882591.24492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882591.24539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882591.24672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882591.24675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882591.24678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882591.24702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882591.24732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882591.24780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882591.24805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882591.24955: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882591.24969: variable 'ansible_facts' from source: unknown 30529 1726882591.25053: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882591.25061: when evaluation is False, skipping this task 30529 1726882591.25068: _execute() done 30529 1726882591.25074: dumping result to json 30529 1726882591.25082: done dumping result, returning 30529 1726882591.25099: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000000269] 30529 1726882591.25114: sending task result for task 12673a56-9f93-b0f1-edc0-000000000269 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882591.25365: no more pending results, returning what we have 30529 1726882591.25368: results queue empty 30529 1726882591.25369: checking for any_errors_fatal 30529 1726882591.25371: done checking for any_errors_fatal 30529 1726882591.25371: checking for max_fail_percentage 30529 1726882591.25373: done checking for max_fail_percentage 30529 1726882591.25374: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.25374: done checking to see if all hosts have failed 30529 1726882591.25375: getting the remaining hosts for this loop 30529 1726882591.25377: done getting the remaining hosts for this loop 30529 1726882591.25381: getting the next task for host managed_node1 30529 1726882591.25397: done getting next task for host managed_node1 30529 1726882591.25401: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882591.25407: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882591.25607: getting variables 30529 1726882591.25609: in VariableManager get_vars() 30529 1726882591.25643: Calling all_inventory to load vars for managed_node1 30529 1726882591.25646: Calling groups_inventory to load vars for managed_node1 30529 1726882591.25649: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.25657: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.25660: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.25663: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.25894: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000269 30529 1726882591.25902: WORKER PROCESS EXITING 30529 1726882591.25928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.26154: done with get_vars() 30529 1726882591.26164: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:31 -0400 (0:00:00.082) 0:00:05.288 ****** 30529 1726882591.26265: entering _queue_task() for managed_node1/stat 30529 1726882591.26534: worker is 1 (out of 1 available) 30529 1726882591.26547: exiting _queue_task() for managed_node1/stat 30529 1726882591.26558: done queuing things up, now waiting for results queue to drain 30529 1726882591.26560: waiting for pending results... 
30529 1726882591.26832: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882591.26980: in run() - task 12673a56-9f93-b0f1-edc0-00000000026b 30529 1726882591.27011: variable 'ansible_search_path' from source: unknown 30529 1726882591.27019: variable 'ansible_search_path' from source: unknown 30529 1726882591.27055: calling self._execute() 30529 1726882591.27141: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.27152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.27169: variable 'omit' from source: magic vars 30529 1726882591.27532: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.27556: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.27734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882591.28029: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882591.28075: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882591.28126: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882591.28160: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882591.28282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882591.28323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882591.28353: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882591.28383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882591.28481: variable '__network_is_ostree' from source: set_fact 30529 1726882591.28499: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882591.28531: when evaluation is False, skipping this task 30529 1726882591.28539: _execute() done 30529 1726882591.28545: dumping result to json 30529 1726882591.28700: done dumping result, returning 30529 1726882591.28704: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-00000000026b] 30529 1726882591.28706: sending task result for task 12673a56-9f93-b0f1-edc0-00000000026b 30529 1726882591.28772: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000026b 30529 1726882591.28775: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882591.28854: no more pending results, returning what we have 30529 1726882591.28858: results queue empty 30529 1726882591.28859: checking for any_errors_fatal 30529 1726882591.28866: done checking for any_errors_fatal 30529 1726882591.28867: checking for max_fail_percentage 30529 1726882591.28868: done checking for max_fail_percentage 30529 1726882591.28869: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.28870: done checking to see if all hosts have failed 30529 1726882591.28871: getting the remaining hosts for this loop 30529 1726882591.28872: done getting the remaining hosts for this loop 30529 
1726882591.28876: getting the next task for host managed_node1 30529 1726882591.28884: done getting next task for host managed_node1 30529 1726882591.28890: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882591.28899: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882591.28911: getting variables 30529 1726882591.28913: in VariableManager get_vars() 30529 1726882591.28946: Calling all_inventory to load vars for managed_node1 30529 1726882591.28949: Calling groups_inventory to load vars for managed_node1 30529 1726882591.28951: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.28961: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.28964: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.28967: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.29373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.29625: done with get_vars() 30529 1726882591.29635: done getting variables 30529 1726882591.29695: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:31 -0400 (0:00:00.034) 0:00:05.323 ****** 30529 1726882591.29729: entering _queue_task() for managed_node1/set_fact 30529 1726882591.29964: worker is 1 (out of 1 available) 30529 1726882591.30092: exiting _queue_task() for managed_node1/set_fact 30529 1726882591.30106: done queuing things up, now waiting for results queue to drain 30529 1726882591.30108: waiting for pending results... 
30529 1726882591.30260: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882591.30441: in run() - task 12673a56-9f93-b0f1-edc0-00000000026c 30529 1726882591.30445: variable 'ansible_search_path' from source: unknown 30529 1726882591.30535: variable 'ansible_search_path' from source: unknown 30529 1726882591.30539: calling self._execute() 30529 1726882591.30575: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.30584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.30604: variable 'omit' from source: magic vars 30529 1726882591.31041: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.31055: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.31206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882591.31505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882591.31553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882591.31590: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882591.31637: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882591.31728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882591.31799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882591.31803: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882591.31828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882591.31923: variable '__network_is_ostree' from source: set_fact 30529 1726882591.31954: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882591.31981: when evaluation is False, skipping this task 30529 1726882591.32046: _execute() done 30529 1726882591.32050: dumping result to json 30529 1726882591.32052: done dumping result, returning 30529 1726882591.32055: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-00000000026c] 30529 1726882591.32057: sending task result for task 12673a56-9f93-b0f1-edc0-00000000026c 30529 1726882591.32124: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000026c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882591.32171: no more pending results, returning what we have 30529 1726882591.32175: results queue empty 30529 1726882591.32176: checking for any_errors_fatal 30529 1726882591.32180: done checking for any_errors_fatal 30529 1726882591.32181: checking for max_fail_percentage 30529 1726882591.32182: done checking for max_fail_percentage 30529 1726882591.32183: checking to see if all hosts have failed and the running result is not ok 30529 1726882591.32184: done checking to see if all hosts have failed 30529 1726882591.32185: getting the remaining hosts for this loop 30529 1726882591.32188: done getting the remaining hosts for this loop 30529 1726882591.32192: getting the next task for 
host managed_node1 30529 1726882591.32203: done getting next task for host managed_node1 30529 1726882591.32206: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882591.32212: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882591.32224: getting variables 30529 1726882591.32225: in VariableManager get_vars() 30529 1726882591.32258: Calling all_inventory to load vars for managed_node1 30529 1726882591.32260: Calling groups_inventory to load vars for managed_node1 30529 1726882591.32262: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882591.32271: Calling all_plugins_play to load vars for managed_node1 30529 1726882591.32274: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882591.32276: Calling groups_plugins_play to load vars for managed_node1 30529 1726882591.32794: WORKER PROCESS EXITING 30529 1726882591.32819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882591.33225: done with get_vars() 30529 1726882591.33235: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:31 -0400 (0:00:00.037) 0:00:05.360 ****** 30529 1726882591.33453: entering _queue_task() for managed_node1/service_facts 30529 1726882591.33455: Creating lock for service_facts 30529 1726882591.33990: worker is 1 (out of 1 available) 30529 1726882591.34004: exiting _queue_task() for managed_node1/service_facts 30529 1726882591.34016: done queuing things up, now waiting for results queue to drain 30529 1726882591.34018: waiting for pending results... 
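The "Check which services are running" task queued here runs the `service_facts` module, whose JSON result (visible later in this log) maps each unit name to a dict with `name`, `state`, `status`, and `source` keys. A short sketch of consuming that structure to list running services — the sample data below is a hypothetical two-entry excerpt, not the full result from this run:

```python
# Hypothetical excerpt of an ansible_facts["services"] mapping, in the
# shape service_facts returns (see the module output later in this log).
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
}

# Collect the names of units currently in the "running" state.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
```

The network role uses this fact to decide which service manager paths to take; filtering on `state == "running"` is the same test a playbook would express as `services['sshd.service'].state == 'running'`.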
30529 1726882591.34322: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882591.34461: in run() - task 12673a56-9f93-b0f1-edc0-00000000026e 30529 1726882591.34481: variable 'ansible_search_path' from source: unknown 30529 1726882591.34492: variable 'ansible_search_path' from source: unknown 30529 1726882591.34625: calling self._execute() 30529 1726882591.34633: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.34644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.34657: variable 'omit' from source: magic vars 30529 1726882591.35015: variable 'ansible_distribution_major_version' from source: facts 30529 1726882591.35031: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882591.35047: variable 'omit' from source: magic vars 30529 1726882591.35128: variable 'omit' from source: magic vars 30529 1726882591.35169: variable 'omit' from source: magic vars 30529 1726882591.35215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882591.35253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882591.35288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882591.35313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882591.35329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882591.35359: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882591.35373: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.35381: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882591.35492: Set connection var ansible_shell_executable to /bin/sh 30529 1726882591.35600: Set connection var ansible_pipelining to False 30529 1726882591.35603: Set connection var ansible_shell_type to sh 30529 1726882591.35606: Set connection var ansible_timeout to 10 30529 1726882591.35608: Set connection var ansible_connection to ssh 30529 1726882591.35610: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882591.35612: variable 'ansible_shell_executable' from source: unknown 30529 1726882591.35615: variable 'ansible_connection' from source: unknown 30529 1726882591.35617: variable 'ansible_module_compression' from source: unknown 30529 1726882591.35618: variable 'ansible_shell_type' from source: unknown 30529 1726882591.35620: variable 'ansible_shell_executable' from source: unknown 30529 1726882591.35622: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882591.35624: variable 'ansible_pipelining' from source: unknown 30529 1726882591.35626: variable 'ansible_timeout' from source: unknown 30529 1726882591.35628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882591.35792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882591.35818: variable 'omit' from source: magic vars 30529 1726882591.35829: starting attempt loop 30529 1726882591.35836: running the handler 30529 1726882591.35859: _low_level_execute_command(): starting 30529 1726882591.35872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882591.36589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882591.36608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30529 1726882591.36691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882591.36732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882591.36747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882591.36765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.36845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882591.38635: stdout chunk (state=3): >>>/root <<< 30529 1726882591.38770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882591.38773: stdout chunk (state=3): >>><<< 30529 1726882591.38775: stderr chunk (state=3): >>><<< 30529 1726882591.38780: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882591.38782: _low_level_execute_command(): starting 30529 1726882591.38784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906 `" && echo ansible-tmp-1726882591.386807-30765-261981920114906="` echo /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906 `" ) && sleep 0' 30529 1726882591.39328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882591.39415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882591.39456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882591.39470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882591.39492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.39567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882591.41419: stdout chunk (state=3): >>>ansible-tmp-1726882591.386807-30765-261981920114906=/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906 <<< 30529 1726882591.41563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882591.41575: stdout chunk (state=3): >>><<< 30529 1726882591.41596: stderr chunk (state=3): >>><<< 30529 1726882591.41799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882591.386807-30765-261981920114906=/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882591.41802: variable 'ansible_module_compression' from source: unknown 30529 1726882591.41805: ANSIBALLZ: Using lock for service_facts 30529 1726882591.41807: ANSIBALLZ: Acquiring lock 30529 1726882591.41809: ANSIBALLZ: Lock acquired: 139794688818496 30529 1726882591.41811: ANSIBALLZ: Creating module 30529 1726882591.55682: ANSIBALLZ: Writing module into payload 30529 1726882591.55785: ANSIBALLZ: Writing module 30529 1726882591.55820: ANSIBALLZ: Renaming module 30529 1726882591.55838: ANSIBALLZ: Done creating module 30529 1726882591.55859: variable 'ansible_facts' from source: unknown 30529 1726882591.55943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py 30529 1726882591.56279: Sending initial data 30529 1726882591.56283: Sent initial data (161 bytes) 30529 1726882591.57153: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882591.57196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration <<< 30529 1726882591.57208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882591.57290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882591.57318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882591.57335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.57420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882591.59340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py" <<< 30529 1726882591.59345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgkuol7os /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py <<< 30529 1726882591.59347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgkuol7os" to remote "/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py" <<< 30529 1726882591.60690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882591.60883: stderr chunk (state=3): >>><<< 30529 1726882591.60886: stdout chunk (state=3): >>><<< 30529 1726882591.60888: done transferring module to remote 30529 1726882591.60890: _low_level_execute_command(): starting 30529 1726882591.60899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/ /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py && sleep 0' 30529 1726882591.61408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882591.61411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882591.61413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass <<< 30529 1726882591.61416: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882591.61417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882591.61420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882591.61469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882591.61476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882591.61505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.61569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882591.63363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882591.63373: stdout chunk (state=3): >>><<< 30529 1726882591.63387: stderr chunk (state=3): >>><<< 30529 1726882591.63468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882591.63471: _low_level_execute_command(): starting 30529 1726882591.63474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/AnsiballZ_service_facts.py && sleep 0' 30529 1726882591.63976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882591.63980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882591.63982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882591.63984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882591.63986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882591.64082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882591.64130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.15983: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": 
{"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882593.17472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.17475: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 30529 1726882593.17478: stdout chunk (state=3): >>><<< 30529 1726882593.17480: stderr chunk (state=3): >>><<< 30529 1726882593.17506: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 30529 1726882593.18292: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882593.18298: _low_level_execute_command(): starting 30529 1726882593.18301: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882591.386807-30765-261981920114906/ > /dev/null 2>&1 && sleep 0' 30529 1726882593.19012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882593.19016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.19067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882593.19095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882593.19115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.19416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.21011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.21059: stderr chunk (state=3): >>><<< 30529 1726882593.21074: stdout chunk (state=3): >>><<< 30529 1726882593.21096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882593.21109: handler run complete 30529 
1726882593.21470: variable 'ansible_facts' from source: unknown 30529 1726882593.21853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882593.22868: variable 'ansible_facts' from source: unknown 30529 1726882593.23300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882593.23675: attempt loop complete, returning result 30529 1726882593.23688: _execute() done 30529 1726882593.23698: dumping result to json 30529 1726882593.23831: done dumping result, returning 30529 1726882593.23853: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-00000000026e] 30529 1726882593.24092: sending task result for task 12673a56-9f93-b0f1-edc0-00000000026e 30529 1726882593.26136: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000026e 30529 1726882593.26140: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882593.26233: no more pending results, returning what we have 30529 1726882593.26236: results queue empty 30529 1726882593.26236: checking for any_errors_fatal 30529 1726882593.26239: done checking for any_errors_fatal 30529 1726882593.26239: checking for max_fail_percentage 30529 1726882593.26241: done checking for max_fail_percentage 30529 1726882593.26241: checking to see if all hosts have failed and the running result is not ok 30529 1726882593.26242: done checking to see if all hosts have failed 30529 1726882593.26243: getting the remaining hosts for this loop 30529 1726882593.26244: done getting the remaining hosts for this loop 30529 1726882593.26248: getting the next task for host managed_node1 30529 1726882593.26254: done getting next task for host managed_node1 30529 1726882593.26257: ^ task is: 
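The `service_facts` result dumped above is a flat mapping of unit name to a record with `name`, `state`, `status`, and `source` keys. As a minimal sketch (plain Python, not Ansible itself), here is how such a mapping can be filtered for running units; the sample data is a small hand-picked subset of the log output above, with field names matching the log:

```python
# Sketch: filter a service_facts-style mapping for units whose state
# is "running". The sample dict is a tiny subset of the real log output;
# the field names (name/state/status/source) match what the module emits.
sample_services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "sssd.service": {"name": "sssd.service", "state": "stopped",
                     "status": "enabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

def running_units(services):
    """Return the sorted names of units whose state is 'running'."""
    return sorted(name for name, info in services.items()
                  if info.get("state") == "running")

print(running_units(sample_services))
# -> ['sshd.service', 'systemd-journald.service']
```

In a playbook the same mapping would be reachable as `ansible_facts.services` after the task runs (here it is censored in the final result because `no_log: true` was set).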
TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882593.26263: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882593.26271: getting variables 30529 1726882593.26273: in VariableManager get_vars() 30529 1726882593.26302: Calling all_inventory to load vars for managed_node1 30529 1726882593.26305: Calling groups_inventory to load vars for managed_node1 30529 1726882593.26307: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882593.26315: Calling all_plugins_play to load vars for managed_node1 30529 1726882593.26318: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882593.26320: Calling groups_plugins_play to load vars for managed_node1 30529 1726882593.27027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882593.28156: done with get_vars() 30529 1726882593.28169: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:33 -0400 (0:00:01.948) 0:00:07.309 ****** 30529 1726882593.28325: entering _queue_task() for managed_node1/package_facts 30529 1726882593.28327: Creating lock for package_facts 30529 1726882593.28955: worker is 1 (out of 1 available) 30529 1726882593.28968: exiting _queue_task() for managed_node1/package_facts 30529 1726882593.28983: done queuing things up, now waiting for results queue to drain 30529 1726882593.28985: waiting for pending results... 
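The HOST STATE dump printed for the next task nests the same record (block, task, rescue, always, run_state, child states, …) one level per tasks-child. A rough, illustrative Python model of that recursion follows; it keeps only a few of the fields shown in the log and is not Ansible's actual `HostState` class:

```python
from dataclasses import dataclass
from typing import Optional

# Rough model of the nested HOST STATE record seen in the log.
# Only the block/task counters and the tasks-child link are modeled;
# this is an illustration, not Ansible's real implementation.
@dataclass
class HostState:
    block: int
    task: int
    rescue: int = 0
    always: int = 0
    tasks_child: Optional["HostState"] = None

    def depth(self) -> int:
        """Number of nesting levels, counting this state itself."""
        return 1 + (self.tasks_child.depth() if self.tasks_child else 0)

# The outermost state above is block=3, task=2, with four nested
# child states (task counters 9, 2, 4, 2), i.e. five levels total.
state = HostState(3, 2, tasks_child=HostState(
    0, 9, tasks_child=HostState(0, 2, tasks_child=HostState(
        0, 4, tasks_child=HostState(0, 2)))))
print(state.depth())
# -> 5
```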
30529 1726882593.29709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882593.30099: in run() - task 12673a56-9f93-b0f1-edc0-00000000026f 30529 1726882593.30103: variable 'ansible_search_path' from source: unknown 30529 1726882593.30106: variable 'ansible_search_path' from source: unknown 30529 1726882593.30109: calling self._execute() 30529 1726882593.30111: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882593.30115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882593.30118: variable 'omit' from source: magic vars 30529 1726882593.30810: variable 'ansible_distribution_major_version' from source: facts 30529 1726882593.30828: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882593.30840: variable 'omit' from source: magic vars 30529 1726882593.31301: variable 'omit' from source: magic vars 30529 1726882593.31304: variable 'omit' from source: magic vars 30529 1726882593.31306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882593.31309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882593.31311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882593.31313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882593.31316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882593.31698: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882593.31701: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882593.31703: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882593.31705: Set connection var ansible_shell_executable to /bin/sh 30529 1726882593.31708: Set connection var ansible_pipelining to False 30529 1726882593.31711: Set connection var ansible_shell_type to sh 30529 1726882593.31713: Set connection var ansible_timeout to 10 30529 1726882593.31716: Set connection var ansible_connection to ssh 30529 1726882593.31719: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882593.31721: variable 'ansible_shell_executable' from source: unknown 30529 1726882593.31724: variable 'ansible_connection' from source: unknown 30529 1726882593.31727: variable 'ansible_module_compression' from source: unknown 30529 1726882593.31730: variable 'ansible_shell_type' from source: unknown 30529 1726882593.31732: variable 'ansible_shell_executable' from source: unknown 30529 1726882593.31735: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882593.31738: variable 'ansible_pipelining' from source: unknown 30529 1726882593.31740: variable 'ansible_timeout' from source: unknown 30529 1726882593.31742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882593.32133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882593.32149: variable 'omit' from source: magic vars 30529 1726882593.32159: starting attempt loop 30529 1726882593.32165: running the handler 30529 1726882593.32183: _low_level_execute_command(): starting 30529 1726882593.32191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882593.33446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.33636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.33750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.35282: stdout chunk (state=3): >>>/root <<< 30529 1726882593.35414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.35430: stdout chunk (state=3): >>><<< 30529 1726882593.35443: stderr chunk (state=3): >>><<< 30529 1726882593.35466: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882593.35492: _low_level_execute_command(): starting 30529 1726882593.35506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748 `" && echo ansible-tmp-1726882593.3547757-30859-25875477139748="` echo /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748 `" ) && sleep 0' 30529 1726882593.36113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882593.36129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882593.36147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882593.36198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882593.36308: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.36417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882593.36446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882593.36486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.36584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.38447: stdout chunk (state=3): >>>ansible-tmp-1726882593.3547757-30859-25875477139748=/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748 <<< 30529 1726882593.38585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.38603: stdout chunk (state=3): >>><<< 30529 1726882593.38615: stderr chunk (state=3): >>><<< 30529 1726882593.38635: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882593.3547757-30859-25875477139748=/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882593.38687: variable 'ansible_module_compression' from source: unknown 30529 1726882593.38749: ANSIBALLZ: Using lock for package_facts 30529 1726882593.38756: ANSIBALLZ: Acquiring lock 30529 1726882593.38762: ANSIBALLZ: Lock acquired: 139794688398096 30529 1726882593.38768: ANSIBALLZ: Creating module 30529 1726882593.60690: ANSIBALLZ: Writing module into payload 30529 1726882593.60777: ANSIBALLZ: Writing module 30529 1726882593.60802: ANSIBALLZ: Renaming module 30529 1726882593.60808: ANSIBALLZ: Done creating module 30529 1726882593.60831: variable 'ansible_facts' from source: unknown 30529 1726882593.60941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py 30529 1726882593.61040: Sending initial data 30529 1726882593.61044: Sent initial data (161 bytes) 30529 1726882593.61466: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882593.61469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.61472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882593.61474: 
stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882593.61476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.61531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882593.61538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882593.61540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.61581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.63177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882593.63227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882593.63282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5amrpy_6 /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py <<< 30529 1726882593.63287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py" <<< 30529 1726882593.63326: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5amrpy_6" to remote "/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py" <<< 30529 1726882593.64862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.64895: stderr chunk (state=3): >>><<< 30529 1726882593.64916: stdout chunk (state=3): >>><<< 30529 1726882593.64934: done transferring module to remote 30529 1726882593.65018: _low_level_execute_command(): starting 30529 1726882593.65026: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/ /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py && sleep 0' 30529 1726882593.65598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882593.65700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882593.65721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882593.65740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.65820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882593.67578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882593.67590: stdout chunk (state=3): >>><<< 30529 1726882593.67604: stderr chunk (state=3): >>><<< 30529 1726882593.67621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882593.67629: _low_level_execute_command(): starting 30529 1726882593.67638: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/AnsiballZ_package_facts.py && sleep 0' 30529 1726882593.68228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882593.68250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882593.68264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882593.68281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882593.68359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882593.68400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882593.68416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882593.68436: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882593.68521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882594.12064: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882594.12080: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30529 1726882594.12113: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30529 1726882594.12146: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30529 1726882594.12159: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30529 1726882594.12166: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30529 1726882594.12201: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": 
[{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30529 1726882594.12235: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1<<< 30529 1726882594.12239: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": 
"perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": 
"4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": 
"0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30529 1726882594.12265: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30529 1726882594.12270: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882594.14016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882594.14048: stderr chunk (state=3): >>><<< 30529 1726882594.14051: stdout chunk (state=3): >>><<< 30529 1726882594.14089: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882594.15676: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882594.15695: _low_level_execute_command(): starting 30529 1726882594.15699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882593.3547757-30859-25875477139748/ > /dev/null 2>&1 && sleep 0' 30529 1726882594.16157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882594.16162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882594.16165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882594.16167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882594.16169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882594.16225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882594.16229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882594.16233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882594.16274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882594.18077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882594.18106: stderr chunk (state=3): >>><<< 30529 1726882594.18110: stdout chunk (state=3): >>><<< 30529 1726882594.18121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882594.18126: handler run complete 30529 1726882594.18565: variable 'ansible_facts' from source: unknown 30529 1726882594.18801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.19878: variable 'ansible_facts' from source: unknown 30529 1726882594.20106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.20482: attempt loop complete, returning result 30529 1726882594.20494: _execute() done 30529 1726882594.20498: dumping result to json 30529 1726882594.20611: done dumping result, returning 30529 1726882594.20619: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-00000000026f] 30529 1726882594.20623: sending task result for task 12673a56-9f93-b0f1-edc0-00000000026f 30529 1726882594.21798: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000026f 30529 1726882594.21801: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882594.21842: no more pending results, returning what we have 30529 1726882594.21844: results queue empty 30529 1726882594.21845: checking for any_errors_fatal 30529 1726882594.21849: done checking for any_errors_fatal 30529 1726882594.21849: checking for max_fail_percentage 30529 1726882594.21850: done checking for max_fail_percentage 30529 1726882594.21850: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.21851: done checking to see if all hosts have failed 30529 1726882594.21851: getting the remaining hosts for this loop 30529 1726882594.21852: done getting the remaining hosts for this loop 30529 1726882594.21855: getting 
the next task for host managed_node1 30529 1726882594.21860: done getting next task for host managed_node1 30529 1726882594.21862: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882594.21866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882594.21872: getting variables 30529 1726882594.21873: in VariableManager get_vars() 30529 1726882594.21897: Calling all_inventory to load vars for managed_node1 30529 1726882594.21899: Calling groups_inventory to load vars for managed_node1 30529 1726882594.21900: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.21907: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.21908: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.21910: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.22615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.23471: done with get_vars() 30529 1726882594.23490: done getting variables 30529 1726882594.23533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:34 -0400 (0:00:00.952) 0:00:08.261 ****** 30529 1726882594.23558: entering _queue_task() for managed_node1/debug 30529 1726882594.23770: worker is 1 (out of 1 available) 30529 1726882594.23783: exiting _queue_task() for managed_node1/debug 30529 1726882594.23798: done queuing things up, now waiting for results queue to drain 30529 1726882594.23800: waiting for pending results... 
30529 1726882594.23962: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882594.24048: in run() - task 12673a56-9f93-b0f1-edc0-00000000020d 30529 1726882594.24060: variable 'ansible_search_path' from source: unknown 30529 1726882594.24063: variable 'ansible_search_path' from source: unknown 30529 1726882594.24096: calling self._execute() 30529 1726882594.24158: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.24162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.24170: variable 'omit' from source: magic vars 30529 1726882594.24431: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.24440: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.24446: variable 'omit' from source: magic vars 30529 1726882594.24487: variable 'omit' from source: magic vars 30529 1726882594.24555: variable 'network_provider' from source: set_fact 30529 1726882594.24574: variable 'omit' from source: magic vars 30529 1726882594.24603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882594.24629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882594.24644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882594.24657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882594.24667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882594.24696: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882594.24699: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882594.24702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.24768: Set connection var ansible_shell_executable to /bin/sh 30529 1726882594.24771: Set connection var ansible_pipelining to False 30529 1726882594.24773: Set connection var ansible_shell_type to sh 30529 1726882594.24783: Set connection var ansible_timeout to 10 30529 1726882594.24786: Set connection var ansible_connection to ssh 30529 1726882594.24798: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882594.24811: variable 'ansible_shell_executable' from source: unknown 30529 1726882594.24815: variable 'ansible_connection' from source: unknown 30529 1726882594.24818: variable 'ansible_module_compression' from source: unknown 30529 1726882594.24820: variable 'ansible_shell_type' from source: unknown 30529 1726882594.24823: variable 'ansible_shell_executable' from source: unknown 30529 1726882594.24825: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.24827: variable 'ansible_pipelining' from source: unknown 30529 1726882594.24830: variable 'ansible_timeout' from source: unknown 30529 1726882594.24835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.24932: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882594.24940: variable 'omit' from source: magic vars 30529 1726882594.24945: starting attempt loop 30529 1726882594.24948: running the handler 30529 1726882594.24980: handler run complete 30529 1726882594.24994: attempt loop complete, returning result 30529 1726882594.24997: _execute() done 30529 1726882594.25000: dumping result to json 30529 1726882594.25004: done dumping result, returning 
30529 1726882594.25016: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-00000000020d] 30529 1726882594.25018: sending task result for task 12673a56-9f93-b0f1-edc0-00000000020d 30529 1726882594.25088: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000020d 30529 1726882594.25091: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882594.25168: no more pending results, returning what we have 30529 1726882594.25171: results queue empty 30529 1726882594.25172: checking for any_errors_fatal 30529 1726882594.25177: done checking for any_errors_fatal 30529 1726882594.25178: checking for max_fail_percentage 30529 1726882594.25179: done checking for max_fail_percentage 30529 1726882594.25180: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.25181: done checking to see if all hosts have failed 30529 1726882594.25182: getting the remaining hosts for this loop 30529 1726882594.25183: done getting the remaining hosts for this loop 30529 1726882594.25186: getting the next task for host managed_node1 30529 1726882594.25194: done getting next task for host managed_node1 30529 1726882594.25197: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882594.25202: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.25211: getting variables 30529 1726882594.25213: in VariableManager get_vars() 30529 1726882594.25242: Calling all_inventory to load vars for managed_node1 30529 1726882594.25244: Calling groups_inventory to load vars for managed_node1 30529 1726882594.25246: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.25254: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.25256: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.25258: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.25960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.26809: done with get_vars() 30529 1726882594.26825: done getting variables 30529 1726882594.26884: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:34 -0400 (0:00:00.033) 0:00:08.295 ****** 30529 1726882594.26914: entering _queue_task() for managed_node1/fail 30529 1726882594.26915: Creating lock for fail 30529 1726882594.27103: worker is 1 (out of 1 available) 30529 1726882594.27116: exiting _queue_task() for managed_node1/fail 30529 1726882594.27127: done queuing things up, now waiting for results queue to drain 30529 1726882594.27129: waiting for pending results... 30529 1726882594.27284: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882594.27367: in run() - task 12673a56-9f93-b0f1-edc0-00000000020e 30529 1726882594.27377: variable 'ansible_search_path' from source: unknown 30529 1726882594.27380: variable 'ansible_search_path' from source: unknown 30529 1726882594.27411: calling self._execute() 30529 1726882594.27466: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.27476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.27484: variable 'omit' from source: magic vars 30529 1726882594.27746: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.27755: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.27838: variable 'network_state' from source: role '' defaults 30529 1726882594.27846: Evaluated conditional (network_state != {}): False 30529 1726882594.27850: when evaluation is False, skipping this task 30529 1726882594.27852: _execute() done 30529 1726882594.27855: dumping result to json 30529 1726882594.27857: done dumping result, returning 30529 1726882594.27865: done running TaskExecutor() 
for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-00000000020e] 30529 1726882594.27869: sending task result for task 12673a56-9f93-b0f1-edc0-00000000020e 30529 1726882594.27955: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000020e 30529 1726882594.27958: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882594.28011: no more pending results, returning what we have 30529 1726882594.28015: results queue empty 30529 1726882594.28016: checking for any_errors_fatal 30529 1726882594.28021: done checking for any_errors_fatal 30529 1726882594.28022: checking for max_fail_percentage 30529 1726882594.28023: done checking for max_fail_percentage 30529 1726882594.28024: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.28025: done checking to see if all hosts have failed 30529 1726882594.28025: getting the remaining hosts for this loop 30529 1726882594.28027: done getting the remaining hosts for this loop 30529 1726882594.28030: getting the next task for host managed_node1 30529 1726882594.28036: done getting next task for host managed_node1 30529 1726882594.28039: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882594.28044: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.28057: getting variables 30529 1726882594.28058: in VariableManager get_vars() 30529 1726882594.28095: Calling all_inventory to load vars for managed_node1 30529 1726882594.28097: Calling groups_inventory to load vars for managed_node1 30529 1726882594.28099: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.28105: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.28107: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.28108: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.28888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.29733: done with get_vars() 30529 1726882594.29746: done getting variables 30529 1726882594.29783: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:34 -0400 (0:00:00.028) 0:00:08.324 ****** 30529 1726882594.29810: entering _queue_task() for managed_node1/fail 30529 1726882594.29988: worker is 1 (out of 1 available) 30529 1726882594.30001: exiting _queue_task() for managed_node1/fail 30529 1726882594.30015: done queuing things up, now waiting for results queue to drain 30529 1726882594.30017: waiting for pending results... 30529 1726882594.30160: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882594.30233: in run() - task 12673a56-9f93-b0f1-edc0-00000000020f 30529 1726882594.30247: variable 'ansible_search_path' from source: unknown 30529 1726882594.30251: variable 'ansible_search_path' from source: unknown 30529 1726882594.30274: calling self._execute() 30529 1726882594.30331: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.30334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.30343: variable 'omit' from source: magic vars 30529 1726882594.30578: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.30591: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.30666: variable 'network_state' from source: role '' defaults 30529 1726882594.30675: Evaluated conditional (network_state != {}): False 30529 1726882594.30678: when evaluation is False, skipping this task 30529 1726882594.30681: _execute() done 30529 1726882594.30683: dumping result to json 30529 1726882594.30688: done dumping result, returning 30529 1726882594.30692: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-00000000020f] 30529 1726882594.30706: sending task result for task 12673a56-9f93-b0f1-edc0-00000000020f 30529 1726882594.30775: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000020f 30529 1726882594.30777: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882594.30843: no more pending results, returning what we have 30529 1726882594.30846: results queue empty 30529 1726882594.30847: checking for any_errors_fatal 30529 1726882594.30853: done checking for any_errors_fatal 30529 1726882594.30854: checking for max_fail_percentage 30529 1726882594.30855: done checking for max_fail_percentage 30529 1726882594.30856: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.30857: done checking to see if all hosts have failed 30529 1726882594.30857: getting the remaining hosts for this loop 30529 1726882594.30858: done getting the remaining hosts for this loop 30529 1726882594.30861: getting the next task for host managed_node1 30529 1726882594.30867: done getting next task for host managed_node1 30529 1726882594.30870: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882594.30875: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.30887: getting variables 30529 1726882594.30888: in VariableManager get_vars() 30529 1726882594.30912: Calling all_inventory to load vars for managed_node1 30529 1726882594.30914: Calling groups_inventory to load vars for managed_node1 30529 1726882594.30916: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.30923: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.30924: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.30926: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.31620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.32551: done with get_vars() 30529 1726882594.32565: done getting variables 30529 1726882594.32607: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:34 -0400 (0:00:00.028) 0:00:08.352 ****** 30529 1726882594.32627: entering _queue_task() for managed_node1/fail 30529 1726882594.32807: worker is 1 (out of 1 available) 30529 1726882594.32820: exiting _queue_task() for managed_node1/fail 30529 1726882594.32831: done queuing things up, now waiting for results queue to drain 30529 1726882594.32833: waiting for pending results... 30529 1726882594.32980: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882594.33053: in run() - task 12673a56-9f93-b0f1-edc0-000000000210 30529 1726882594.33063: variable 'ansible_search_path' from source: unknown 30529 1726882594.33067: variable 'ansible_search_path' from source: unknown 30529 1726882594.33101: calling self._execute() 30529 1726882594.33151: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.33155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.33163: variable 'omit' from source: magic vars 30529 1726882594.33399: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.33409: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.33521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.34951: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.35004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.35034: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.35062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.35083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.35141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.35163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.35182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.35212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.35224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.35289: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.35305: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882594.35376: variable 'ansible_distribution' from source: facts 30529 1726882594.35379: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.35390: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882594.35546: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.35564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.35581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.35614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.35625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.35657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.35674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.35718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.35721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882594.35733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.35761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.35778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.35797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.35826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.35833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.36048: variable 'network_connections' from source: include params 30529 1726882594.36056: variable 'interface' from source: play vars 30529 1726882594.36106: variable 'interface' from source: play vars 30529 1726882594.36117: variable 'network_state' from source: role '' defaults 30529 1726882594.36161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.36270: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.36297: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.36320: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.36342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.36373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.36391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.36413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.36431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.36457: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882594.36461: when evaluation is False, skipping this task 30529 1726882594.36463: _execute() done 30529 1726882594.36466: dumping result to json 30529 1726882594.36468: done dumping result, returning 30529 1726882594.36479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000000210] 30529 1726882594.36481: sending task result for task 
12673a56-9f93-b0f1-edc0-000000000210 30529 1726882594.36555: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000210 30529 1726882594.36558: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882594.36605: no more pending results, returning what we have 30529 1726882594.36608: results queue empty 30529 1726882594.36609: checking for any_errors_fatal 30529 1726882594.36614: done checking for any_errors_fatal 30529 1726882594.36615: checking for max_fail_percentage 30529 1726882594.36616: done checking for max_fail_percentage 30529 1726882594.36617: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.36618: done checking to see if all hosts have failed 30529 1726882594.36619: getting the remaining hosts for this loop 30529 1726882594.36620: done getting the remaining hosts for this loop 30529 1726882594.36623: getting the next task for host managed_node1 30529 1726882594.36630: done getting next task for host managed_node1 30529 1726882594.36634: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882594.36638: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.36650: getting variables 30529 1726882594.36651: in VariableManager get_vars() 30529 1726882594.36677: Calling all_inventory to load vars for managed_node1 30529 1726882594.36679: Calling groups_inventory to load vars for managed_node1 30529 1726882594.36681: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.36691: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.36695: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.36698: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.37423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.38767: done with get_vars() 30529 1726882594.38781: done getting variables 30529 1726882594.38847: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:34 -0400 (0:00:00.062) 0:00:08.414 ****** 30529 1726882594.38868: entering _queue_task() for managed_node1/dnf 30529 1726882594.39053: worker is 1 (out of 1 available) 30529 1726882594.39065: exiting _queue_task() for managed_node1/dnf 30529 1726882594.39077: done queuing things up, now waiting for results queue to drain 30529 1726882594.39078: waiting for pending results... 30529 1726882594.39243: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882594.39323: in run() - task 12673a56-9f93-b0f1-edc0-000000000211 30529 1726882594.39334: variable 'ansible_search_path' from source: unknown 30529 1726882594.39337: variable 'ansible_search_path' from source: unknown 30529 1726882594.39364: calling self._execute() 30529 1726882594.39429: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.39432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.39440: variable 'omit' from source: magic vars 30529 1726882594.39682: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.39695: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.39823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.41860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.42008: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.42011: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.42025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.42057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.42137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.42169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.42203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.42248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.42265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.42375: variable 'ansible_distribution' from source: facts 30529 1726882594.42384: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.42498: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882594.42518: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.42648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.42675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.42707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.42749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.42766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.42814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.42841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.42868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.42913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.42930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.42970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.43003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.43030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.43070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.43090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.43243: variable 'network_connections' from source: include params 30529 1726882594.43497: variable 'interface' from source: play vars 30529 1726882594.43501: variable 'interface' from source: play vars 30529 1726882594.43503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.43561: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.43604: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.43639: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.43670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.43718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.43745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.43783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.43817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.43872: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882594.44419: variable 'network_connections' from source: include params 30529 1726882594.44429: variable 'interface' from source: play vars 30529 1726882594.44495: variable 'interface' from source: play vars 30529 1726882594.44530: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882594.44538: when evaluation is False, skipping this task 30529 1726882594.44544: _execute() done 30529 1726882594.44550: dumping result to json 30529 1726882594.44557: done dumping result, returning 30529 1726882594.44567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000211] 30529 
1726882594.44578: sending task result for task 12673a56-9f93-b0f1-edc0-000000000211 30529 1726882594.44676: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000211 30529 1726882594.44683: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882594.44733: no more pending results, returning what we have 30529 1726882594.44736: results queue empty 30529 1726882594.44737: checking for any_errors_fatal 30529 1726882594.44741: done checking for any_errors_fatal 30529 1726882594.44741: checking for max_fail_percentage 30529 1726882594.44743: done checking for max_fail_percentage 30529 1726882594.44743: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.44745: done checking to see if all hosts have failed 30529 1726882594.44745: getting the remaining hosts for this loop 30529 1726882594.44747: done getting the remaining hosts for this loop 30529 1726882594.44750: getting the next task for host managed_node1 30529 1726882594.44758: done getting next task for host managed_node1 30529 1726882594.44762: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882594.44766: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.44779: getting variables 30529 1726882594.44780: in VariableManager get_vars() 30529 1726882594.44818: Calling all_inventory to load vars for managed_node1 30529 1726882594.44821: Calling groups_inventory to load vars for managed_node1 30529 1726882594.44823: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.44832: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.44835: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.44837: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.46324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.47901: done with get_vars() 30529 1726882594.47921: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882594.47990: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:34 -0400 (0:00:00.091) 0:00:08.506 ****** 30529 1726882594.48025: entering _queue_task() for managed_node1/yum 30529 1726882594.48027: Creating lock for yum 30529 1726882594.48304: worker is 1 (out of 1 available) 30529 1726882594.48317: exiting _queue_task() for managed_node1/yum 30529 1726882594.48329: done queuing things up, now waiting for results queue to drain 30529 1726882594.48330: waiting for pending results... 30529 1726882594.48599: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882594.48732: in run() - task 12673a56-9f93-b0f1-edc0-000000000212 30529 1726882594.48750: variable 'ansible_search_path' from source: unknown 30529 1726882594.48758: variable 'ansible_search_path' from source: unknown 30529 1726882594.48799: calling self._execute() 30529 1726882594.48888: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.48902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.48917: variable 'omit' from source: magic vars 30529 1726882594.49264: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.49280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.49451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.51674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.51756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.51870: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.51873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.51875: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.51944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.51978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.52017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.52059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.52077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.52176: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.52204: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882594.52215: when evaluation is False, skipping this task 30529 1726882594.52222: _execute() done 30529 1726882594.52228: dumping result to json 30529 1726882594.52235: done dumping result, returning 30529 1726882594.52300: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000212] 30529 1726882594.52303: sending task result for task 12673a56-9f93-b0f1-edc0-000000000212 30529 1726882594.52372: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000212 30529 1726882594.52375: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882594.52429: no more pending results, returning what we have 30529 1726882594.52433: results queue empty 30529 1726882594.52434: checking for any_errors_fatal 30529 1726882594.52439: done checking for any_errors_fatal 30529 1726882594.52440: checking for max_fail_percentage 30529 1726882594.52442: done checking for max_fail_percentage 30529 1726882594.52442: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.52443: done checking to see if all hosts have failed 30529 1726882594.52444: getting the remaining hosts for this loop 30529 1726882594.52446: done getting the remaining hosts for this loop 30529 1726882594.52449: getting the next task for host managed_node1 30529 1726882594.52458: done getting next task for host managed_node1 30529 1726882594.52462: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882594.52468: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.52482: getting variables 30529 1726882594.52484: in VariableManager get_vars() 30529 1726882594.52523: Calling all_inventory to load vars for managed_node1 30529 1726882594.52526: Calling groups_inventory to load vars for managed_node1 30529 1726882594.52528: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.52539: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.52542: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.52545: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.54195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.55572: done with get_vars() 30529 1726882594.55600: done getting variables 30529 1726882594.55655: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:34 -0400 (0:00:00.076) 0:00:08.582 ****** 30529 1726882594.55691: entering _queue_task() for managed_node1/fail 30529 1726882594.56000: worker is 1 (out of 1 available) 30529 1726882594.56016: exiting _queue_task() for managed_node1/fail 30529 1726882594.56028: done queuing things up, now waiting for results queue to drain 30529 1726882594.56029: waiting for pending results... 30529 1726882594.56415: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882594.56440: in run() - task 12673a56-9f93-b0f1-edc0-000000000213 30529 1726882594.56461: variable 'ansible_search_path' from source: unknown 30529 1726882594.56469: variable 'ansible_search_path' from source: unknown 30529 1726882594.56515: calling self._execute() 30529 1726882594.56611: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.56632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.56650: variable 'omit' from source: magic vars 30529 1726882594.57026: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.57045: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.57183: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.57389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.59844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.59996: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.60000: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.60015: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.60048: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.60132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.60167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.60205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.60253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.60273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.60336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.60364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.60397: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.60446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.60539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.60542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.60544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.60567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.60612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.60630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.60806: variable 'network_connections' from source: include params 30529 1726882594.60823: variable 'interface' from source: play vars 30529 1726882594.60902: variable 'interface' from source: play vars 30529 1726882594.60977: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.61149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.61197: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.61233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.61266: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.61318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.61408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.61411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.61413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.61467: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882594.61702: variable 'network_connections' from source: include params 30529 1726882594.61713: variable 'interface' from source: play vars 30529 1726882594.61776: variable 'interface' from source: play vars 30529 1726882594.61817: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882594.61827: when evaluation is False, skipping this task 30529 
1726882594.61834: _execute() done 30529 1726882594.61844: dumping result to json 30529 1726882594.61851: done dumping result, returning 30529 1726882594.61862: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000213] 30529 1726882594.61869: sending task result for task 12673a56-9f93-b0f1-edc0-000000000213 30529 1726882594.62027: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000213 30529 1726882594.62030: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882594.62113: no more pending results, returning what we have 30529 1726882594.62116: results queue empty 30529 1726882594.62117: checking for any_errors_fatal 30529 1726882594.62123: done checking for any_errors_fatal 30529 1726882594.62124: checking for max_fail_percentage 30529 1726882594.62126: done checking for max_fail_percentage 30529 1726882594.62127: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.62128: done checking to see if all hosts have failed 30529 1726882594.62128: getting the remaining hosts for this loop 30529 1726882594.62130: done getting the remaining hosts for this loop 30529 1726882594.62134: getting the next task for host managed_node1 30529 1726882594.62142: done getting next task for host managed_node1 30529 1726882594.62146: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882594.62151: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.62164: getting variables 30529 1726882594.62166: in VariableManager get_vars() 30529 1726882594.62206: Calling all_inventory to load vars for managed_node1 30529 1726882594.62209: Calling groups_inventory to load vars for managed_node1 30529 1726882594.62212: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.62223: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.62226: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.62229: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.63849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.65435: done with get_vars() 30529 1726882594.65458: done getting variables 30529 1726882594.65525: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:34 -0400 (0:00:00.098) 0:00:08.681 ****** 30529 1726882594.65561: entering _queue_task() for managed_node1/package 30529 1726882594.65864: worker is 1 (out of 1 available) 30529 1726882594.65878: exiting _queue_task() for managed_node1/package 30529 1726882594.65892: done queuing things up, now waiting for results queue to drain 30529 1726882594.65895: waiting for pending results... 30529 1726882594.66222: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882594.66323: in run() - task 12673a56-9f93-b0f1-edc0-000000000214 30529 1726882594.66328: variable 'ansible_search_path' from source: unknown 30529 1726882594.66330: variable 'ansible_search_path' from source: unknown 30529 1726882594.66354: calling self._execute() 30529 1726882594.66445: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.66499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.66503: variable 'omit' from source: magic vars 30529 1726882594.66835: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.66852: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.67040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.67279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.67317: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.67341: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.67365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.67455: variable 'network_packages' from source: role '' defaults 30529 1726882594.67528: variable '__network_provider_setup' from source: role '' defaults 30529 1726882594.67536: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882594.67584: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882594.67590: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882594.67636: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882594.67748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.69063: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.69103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.69128: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.69162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.69182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.69255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.69498: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.69502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.69512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.69515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.69517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.69519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.69521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.69523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.69525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882594.69623: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882594.69691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.69710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.69728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.69752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.69763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.69822: variable 'ansible_python' from source: facts 30529 1726882594.69837: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882594.69890: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882594.69946: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882594.70022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.70038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.70057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.70081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.70092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.70124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.70144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.70162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.70188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.70198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.70498: variable 'network_connections' from source: include params 
30529 1726882594.70502: variable 'interface' from source: play vars 30529 1726882594.70504: variable 'interface' from source: play vars 30529 1726882594.70507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.70510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.70515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.70545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.70588: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.70864: variable 'network_connections' from source: include params 30529 1726882594.70868: variable 'interface' from source: play vars 30529 1726882594.70941: variable 'interface' from source: play vars 30529 1726882594.70977: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882594.71031: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.71221: variable 'network_connections' from source: include params 30529 1726882594.71225: variable 'interface' from source: play vars 30529 1726882594.71271: variable 'interface' from source: play vars 30529 1726882594.71291: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882594.71344: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882594.71534: variable 'network_connections' 
from source: include params 30529 1726882594.71537: variable 'interface' from source: play vars 30529 1726882594.71582: variable 'interface' from source: play vars 30529 1726882594.71625: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882594.71665: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882594.71671: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882594.71716: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882594.71847: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882594.72135: variable 'network_connections' from source: include params 30529 1726882594.72139: variable 'interface' from source: play vars 30529 1726882594.72179: variable 'interface' from source: play vars 30529 1726882594.72188: variable 'ansible_distribution' from source: facts 30529 1726882594.72191: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.72196: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.72217: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882594.72322: variable 'ansible_distribution' from source: facts 30529 1726882594.72325: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.72329: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.72338: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882594.72442: variable 'ansible_distribution' from source: facts 30529 1726882594.72446: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.72449: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.72476: variable 'network_provider' from source: set_fact 30529 
1726882594.72490: variable 'ansible_facts' from source: unknown 30529 1726882594.72899: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882594.72903: when evaluation is False, skipping this task 30529 1726882594.72906: _execute() done 30529 1726882594.72908: dumping result to json 30529 1726882594.72911: done dumping result, returning 30529 1726882594.72919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000000214] 30529 1726882594.72922: sending task result for task 12673a56-9f93-b0f1-edc0-000000000214 30529 1726882594.73011: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000214 30529 1726882594.73014: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882594.73061: no more pending results, returning what we have 30529 1726882594.73065: results queue empty 30529 1726882594.73066: checking for any_errors_fatal 30529 1726882594.73071: done checking for any_errors_fatal 30529 1726882594.73072: checking for max_fail_percentage 30529 1726882594.73073: done checking for max_fail_percentage 30529 1726882594.73074: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.73074: done checking to see if all hosts have failed 30529 1726882594.73075: getting the remaining hosts for this loop 30529 1726882594.73077: done getting the remaining hosts for this loop 30529 1726882594.73081: getting the next task for host managed_node1 30529 1726882594.73091: done getting next task for host managed_node1 30529 1726882594.73096: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882594.73101: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882594.73114: getting variables 30529 1726882594.73115: in VariableManager get_vars() 30529 1726882594.73158: Calling all_inventory to load vars for managed_node1 30529 1726882594.73160: Calling groups_inventory to load vars for managed_node1 30529 1726882594.73162: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.73171: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.73173: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.73176: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.73929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.74864: done with get_vars() 30529 1726882594.74879: done getting variables 30529 1726882594.74923: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:34 -0400 (0:00:00.093) 0:00:08.775 ****** 30529 1726882594.74947: entering _queue_task() for managed_node1/package 30529 1726882594.75159: worker is 1 (out of 1 available) 30529 1726882594.75174: exiting _queue_task() for managed_node1/package 30529 1726882594.75190: done queuing things up, now waiting for results queue to drain 30529 1726882594.75192: waiting for pending results... 
30529 1726882594.75354: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882594.75440: in run() - task 12673a56-9f93-b0f1-edc0-000000000215 30529 1726882594.75452: variable 'ansible_search_path' from source: unknown 30529 1726882594.75455: variable 'ansible_search_path' from source: unknown 30529 1726882594.75481: calling self._execute() 30529 1726882594.75555: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.75559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.75566: variable 'omit' from source: magic vars 30529 1726882594.75817: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.75826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.75909: variable 'network_state' from source: role '' defaults 30529 1726882594.75916: Evaluated conditional (network_state != {}): False 30529 1726882594.75920: when evaluation is False, skipping this task 30529 1726882594.75922: _execute() done 30529 1726882594.75925: dumping result to json 30529 1726882594.75927: done dumping result, returning 30529 1726882594.75936: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000215] 30529 1726882594.75944: sending task result for task 12673a56-9f93-b0f1-edc0-000000000215 30529 1726882594.76037: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000215 30529 1726882594.76040: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882594.76112: no more pending results, returning what we have 30529 1726882594.76115: results queue empty 30529 1726882594.76116: checking 
for any_errors_fatal 30529 1726882594.76119: done checking for any_errors_fatal 30529 1726882594.76120: checking for max_fail_percentage 30529 1726882594.76122: done checking for max_fail_percentage 30529 1726882594.76122: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.76123: done checking to see if all hosts have failed 30529 1726882594.76124: getting the remaining hosts for this loop 30529 1726882594.76125: done getting the remaining hosts for this loop 30529 1726882594.76128: getting the next task for host managed_node1 30529 1726882594.76134: done getting next task for host managed_node1 30529 1726882594.76137: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882594.76141: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882594.76153: getting variables 30529 1726882594.76154: in VariableManager get_vars() 30529 1726882594.76180: Calling all_inventory to load vars for managed_node1 30529 1726882594.76183: Calling groups_inventory to load vars for managed_node1 30529 1726882594.76185: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.76196: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.76198: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.76200: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.79584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.80416: done with get_vars() 30529 1726882594.80430: done getting variables 30529 1726882594.80463: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:34 -0400 (0:00:00.055) 0:00:08.830 ****** 30529 1726882594.80482: entering _queue_task() for managed_node1/package 30529 1726882594.80714: worker is 1 (out of 1 available) 30529 1726882594.80727: exiting _queue_task() for managed_node1/package 30529 1726882594.80737: done queuing things up, now waiting for results queue to drain 30529 1726882594.80739: waiting for pending results... 
30529 1726882594.80902: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882594.80992: in run() - task 12673a56-9f93-b0f1-edc0-000000000216 30529 1726882594.81006: variable 'ansible_search_path' from source: unknown 30529 1726882594.81009: variable 'ansible_search_path' from source: unknown 30529 1726882594.81034: calling self._execute() 30529 1726882594.81100: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.81107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.81116: variable 'omit' from source: magic vars 30529 1726882594.81373: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.81382: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.81465: variable 'network_state' from source: role '' defaults 30529 1726882594.81473: Evaluated conditional (network_state != {}): False 30529 1726882594.81476: when evaluation is False, skipping this task 30529 1726882594.81481: _execute() done 30529 1726882594.81484: dumping result to json 30529 1726882594.81489: done dumping result, returning 30529 1726882594.81495: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000216] 30529 1726882594.81498: sending task result for task 12673a56-9f93-b0f1-edc0-000000000216 30529 1726882594.81588: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000216 30529 1726882594.81591: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882594.81652: no more pending results, returning what we have 30529 1726882594.81655: results queue empty 30529 1726882594.81656: checking for 
any_errors_fatal 30529 1726882594.81663: done checking for any_errors_fatal 30529 1726882594.81664: checking for max_fail_percentage 30529 1726882594.81666: done checking for max_fail_percentage 30529 1726882594.81666: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.81667: done checking to see if all hosts have failed 30529 1726882594.81668: getting the remaining hosts for this loop 30529 1726882594.81670: done getting the remaining hosts for this loop 30529 1726882594.81673: getting the next task for host managed_node1 30529 1726882594.81680: done getting next task for host managed_node1 30529 1726882594.81682: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882594.81690: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882594.81706: getting variables 30529 1726882594.81707: in VariableManager get_vars() 30529 1726882594.81733: Calling all_inventory to load vars for managed_node1 30529 1726882594.81735: Calling groups_inventory to load vars for managed_node1 30529 1726882594.81737: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.81744: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.81746: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.81749: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.82454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.83384: done with get_vars() 30529 1726882594.83403: done getting variables 30529 1726882594.83465: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:34 -0400 (0:00:00.030) 0:00:08.860 ****** 30529 1726882594.83488: entering _queue_task() for managed_node1/service 30529 1726882594.83489: Creating lock for service 30529 1726882594.83685: worker is 1 (out of 1 available) 30529 1726882594.83702: exiting _queue_task() for managed_node1/service 30529 1726882594.83714: done queuing things up, now waiting for results queue to drain 30529 1726882594.83716: waiting for pending results... 
30529 1726882594.83859: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882594.83944: in run() - task 12673a56-9f93-b0f1-edc0-000000000217 30529 1726882594.83951: variable 'ansible_search_path' from source: unknown 30529 1726882594.83954: variable 'ansible_search_path' from source: unknown 30529 1726882594.83979: calling self._execute() 30529 1726882594.84039: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.84042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.84051: variable 'omit' from source: magic vars 30529 1726882594.84313: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.84322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.84407: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.84531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.85949: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.86005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.86035: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.86061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.86081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.86140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882594.86161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.86178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.86208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.86218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.86252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.86268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.86286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.86315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.86326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.86355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.86371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.86387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.86416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.86426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.86536: variable 'network_connections' from source: include params 30529 1726882594.86544: variable 'interface' from source: play vars 30529 1726882594.86596: variable 'interface' from source: play vars 30529 1726882594.86641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.86755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.86783: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.86809: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.86832: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.86861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.86877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.86900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.86918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.86959: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882594.87105: variable 'network_connections' from source: include params 30529 1726882594.87109: variable 'interface' from source: play vars 30529 1726882594.87151: variable 'interface' from source: play vars 30529 1726882594.87173: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882594.87177: when evaluation is False, skipping this task 30529 1726882594.87179: _execute() done 30529 1726882594.87182: dumping result to json 30529 1726882594.87184: done dumping result, returning 30529 1726882594.87195: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000217] 30529 1726882594.87198: sending task result for task 12673a56-9f93-b0f1-edc0-000000000217 30529 1726882594.87276: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000000217 30529 1726882594.87285: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882594.87353: no more pending results, returning what we have 30529 1726882594.87355: results queue empty 30529 1726882594.87356: checking for any_errors_fatal 30529 1726882594.87364: done checking for any_errors_fatal 30529 1726882594.87365: checking for max_fail_percentage 30529 1726882594.87366: done checking for max_fail_percentage 30529 1726882594.87367: checking to see if all hosts have failed and the running result is not ok 30529 1726882594.87368: done checking to see if all hosts have failed 30529 1726882594.87368: getting the remaining hosts for this loop 30529 1726882594.87370: done getting the remaining hosts for this loop 30529 1726882594.87373: getting the next task for host managed_node1 30529 1726882594.87380: done getting next task for host managed_node1 30529 1726882594.87383: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882594.87388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882594.87401: getting variables 30529 1726882594.87402: in VariableManager get_vars() 30529 1726882594.87429: Calling all_inventory to load vars for managed_node1 30529 1726882594.87431: Calling groups_inventory to load vars for managed_node1 30529 1726882594.87433: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882594.87440: Calling all_plugins_play to load vars for managed_node1 30529 1726882594.87443: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882594.87445: Calling groups_plugins_play to load vars for managed_node1 30529 1726882594.88175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882594.89046: done with get_vars() 30529 1726882594.89060: done getting variables 30529 1726882594.89106: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:34 -0400 (0:00:00.056) 0:00:08.917 ****** 30529 1726882594.89128: entering _queue_task() for managed_node1/service 30529 1726882594.89345: worker is 1 (out of 1 available) 30529 1726882594.89357: exiting _queue_task() for managed_node1/service 30529 1726882594.89370: done 
queuing things up, now waiting for results queue to drain 30529 1726882594.89372: waiting for pending results... 30529 1726882594.89538: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882594.89624: in run() - task 12673a56-9f93-b0f1-edc0-000000000218 30529 1726882594.89635: variable 'ansible_search_path' from source: unknown 30529 1726882594.89638: variable 'ansible_search_path' from source: unknown 30529 1726882594.89666: calling self._execute() 30529 1726882594.89729: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.89733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.89742: variable 'omit' from source: magic vars 30529 1726882594.90001: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.90012: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882594.90120: variable 'network_provider' from source: set_fact 30529 1726882594.90124: variable 'network_state' from source: role '' defaults 30529 1726882594.90131: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882594.90139: variable 'omit' from source: magic vars 30529 1726882594.90176: variable 'omit' from source: magic vars 30529 1726882594.90198: variable 'network_service_name' from source: role '' defaults 30529 1726882594.90247: variable 'network_service_name' from source: role '' defaults 30529 1726882594.90319: variable '__network_provider_setup' from source: role '' defaults 30529 1726882594.90322: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882594.90367: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882594.90374: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882594.90421: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882594.90563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882594.91969: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882594.92253: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882594.92282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882594.92308: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882594.92331: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882594.92385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.92408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.92426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.92455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.92466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.92499: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.92515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.92531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.92559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.92569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.92898: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882594.92902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.92904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.92927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.92971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.92997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.93096: variable 'ansible_python' from source: facts 30529 1726882594.93117: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882594.93209: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882594.93304: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882594.93445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.93485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.93522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.93573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.93678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.93682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882594.93699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882594.93730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.93774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882594.93809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882594.93958: variable 'network_connections' from source: include params 30529 1726882594.93971: variable 'interface' from source: play vars 30529 1726882594.94064: variable 'interface' from source: play vars 30529 1726882594.94182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882594.94384: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882594.94501: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882594.94520: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882594.94573: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882594.94652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882594.94698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882594.94766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882594.94796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882594.94852: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.95176: variable 'network_connections' from source: include params 30529 1726882594.95208: variable 'interface' from source: play vars 30529 1726882594.95309: variable 'interface' from source: play vars 30529 1726882594.95354: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882594.95452: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882594.95772: variable 'network_connections' from source: include params 30529 1726882594.95857: variable 'interface' from source: play vars 30529 1726882594.95871: variable 'interface' from source: play vars 30529 1726882594.95905: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882594.96008: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882594.96344: variable 'network_connections' from source: include params 30529 1726882594.96356: variable 'interface' from source: play vars 30529 1726882594.96445: variable 'interface' from source: play vars 30529 1726882594.96698: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882594.96704: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882594.96707: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882594.96709: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882594.96913: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882594.97482: variable 'network_connections' from source: include params 30529 1726882594.97500: variable 'interface' from source: play vars 30529 1726882594.97562: variable 'interface' from source: play vars 30529 1726882594.97576: variable 'ansible_distribution' from source: facts 30529 1726882594.97598: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.97609: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.97644: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882594.97959: variable 'ansible_distribution' from source: facts 30529 1726882594.97962: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.97964: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.97966: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882594.98210: variable 'ansible_distribution' from source: facts 30529 1726882594.98220: variable '__network_rh_distros' from source: role '' defaults 30529 1726882594.98229: variable 'ansible_distribution_major_version' from source: facts 30529 1726882594.98277: variable 'network_provider' from source: set_fact 30529 1726882594.98315: variable 'omit' from source: magic vars 30529 1726882594.98602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882594.98606: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882594.98609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882594.98611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882594.98622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882594.98655: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882594.98665: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.98673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882594.98903: Set connection var ansible_shell_executable to /bin/sh 30529 1726882594.98935: Set connection var ansible_pipelining to False 30529 1726882594.98944: Set connection var ansible_shell_type to sh 30529 1726882594.99016: Set connection var ansible_timeout to 10 30529 1726882594.99023: Set connection var ansible_connection to ssh 30529 1726882594.99039: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882594.99068: variable 'ansible_shell_executable' from source: unknown 30529 1726882594.99143: variable 'ansible_connection' from source: unknown 30529 1726882594.99146: variable 'ansible_module_compression' from source: unknown 30529 1726882594.99149: variable 'ansible_shell_type' from source: unknown 30529 1726882594.99151: variable 'ansible_shell_executable' from source: unknown 30529 1726882594.99153: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882594.99155: variable 'ansible_pipelining' from source: unknown 30529 1726882594.99163: variable 'ansible_timeout' from source: unknown 30529 1726882594.99171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882594.99470: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882594.99481: variable 'omit' from source: magic vars 30529 1726882594.99498: starting attempt loop 30529 1726882594.99689: running the handler 30529 1726882594.99694: variable 'ansible_facts' from source: unknown 30529 1726882595.00663: _low_level_execute_command(): starting 30529 1726882595.00676: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882595.01519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882595.01545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882595.01560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882595.01583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 
1726882595.01673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.03372: stdout chunk (state=3): >>>/root <<< 30529 1726882595.03530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882595.03534: stdout chunk (state=3): >>><<< 30529 1726882595.03536: stderr chunk (state=3): >>><<< 30529 1726882595.03554: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882595.03663: _low_level_execute_command(): starting 30529 1726882595.03667: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351 `" && echo ansible-tmp-1726882595.0357258-30937-147466470945351="` echo 
/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351 `" ) && sleep 0' 30529 1726882595.04441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882595.04456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882595.04471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882595.04499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882595.04518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882595.04590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.06469: stdout chunk (state=3): >>>ansible-tmp-1726882595.0357258-30937-147466470945351=/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351 <<< 30529 1726882595.06619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882595.06634: stdout chunk (state=3): >>><<< 30529 1726882595.06647: stderr chunk (state=3): >>><<< 30529 1726882595.06668: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882595.0357258-30937-147466470945351=/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882595.06707: variable 'ansible_module_compression' from source: unknown 30529 1726882595.06799: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 30529 1726882595.06803: ANSIBALLZ: Acquiring lock 30529 1726882595.06805: ANSIBALLZ: Lock acquired: 139794692461328 30529 1726882595.06808: ANSIBALLZ: Creating module 30529 1726882595.35337: ANSIBALLZ: Writing module into payload 30529 1726882595.35700: ANSIBALLZ: Writing module 30529 1726882595.35704: ANSIBALLZ: Renaming module 30529 1726882595.35706: ANSIBALLZ: Done creating module 30529 1726882595.35707: variable 'ansible_facts' from source: unknown 30529 1726882595.35804: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py 30529 1726882595.35951: Sending initial data 30529 1726882595.36052: Sent initial data (156 bytes) 30529 1726882595.36617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882595.36633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882595.36649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882595.36672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882595.36711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882595.36728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882595.36814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882595.36861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882595.36909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.38551: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882595.38611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882595.38680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpy16gic9o /root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py <<< 30529 1726882595.38701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py" <<< 30529 1726882595.38736: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpy16gic9o" to remote "/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py" <<< 30529 1726882595.40438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882595.40442: stdout chunk (state=3): >>><<< 30529 1726882595.40444: stderr chunk (state=3): >>><<< 30529 1726882595.40446: done transferring module to remote 30529 1726882595.40448: _low_level_execute_command(): starting 30529 1726882595.40450: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/ /root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py && sleep 0' 30529 1726882595.41036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882595.41040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882595.41053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882595.41141: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882595.41159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882595.41174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882595.41247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.43038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882595.43052: stdout chunk (state=3): >>><<< 30529 1726882595.43062: stderr chunk (state=3): >>><<< 30529 1726882595.43078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882595.43156: _low_level_execute_command(): starting 30529 1726882595.43160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/AnsiballZ_systemd.py && sleep 0' 30529 1726882595.43720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882595.43766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882595.43786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882595.43799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882595.43888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.72665: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10809344", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327053824", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1686005000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882595.74919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882595.74923: stdout chunk (state=3): >>><<< 30529 1726882595.74925: stderr chunk (state=3): >>><<< 30529 1726882595.74928: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10809344", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327053824", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1686005000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882595.75101: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882595.75155: _low_level_execute_command(): starting 30529 1726882595.75211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882595.0357258-30937-147466470945351/ > /dev/null 2>&1 && sleep 0' 30529 1726882595.76462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882595.76476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882595.76486: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882595.76503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882595.76680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882595.76724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882595.76746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882595.76952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882595.78766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882595.78776: stdout chunk (state=3): >>><<< 30529 1726882595.78807: stderr chunk (state=3): >>><<< 30529 1726882595.78845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882595.78858: handler run complete 30529 1726882595.79200: attempt loop complete, returning result 30529 1726882595.79204: _execute() done 30529 1726882595.79206: dumping result to json 30529 1726882595.79208: done dumping result, returning 30529 1726882595.79210: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000000218] 30529 1726882595.79211: sending task result for task 12673a56-9f93-b0f1-edc0-000000000218 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882595.79856: no more pending results, returning what we have 30529 1726882595.79860: results queue empty 30529 1726882595.79861: checking for any_errors_fatal 30529 1726882595.79866: done checking for any_errors_fatal 30529 1726882595.79867: checking for max_fail_percentage 30529 1726882595.79868: done checking for max_fail_percentage 30529 1726882595.79869: checking to see if all hosts have failed and the running result is not ok 30529 1726882595.79870: done checking to see if all hosts have failed 30529 1726882595.79870: getting the remaining hosts for this loop 30529 1726882595.79872: done getting the remaining hosts for this loop 30529 1726882595.79875: getting the next task for host managed_node1 30529 1726882595.79882: done getting next task for host managed_node1 30529 1726882595.79885: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882595.79890: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882595.79902: getting variables 30529 1726882595.79904: in VariableManager get_vars() 30529 1726882595.79938: Calling all_inventory to load vars for managed_node1 30529 1726882595.79941: Calling groups_inventory to load vars for managed_node1 30529 1726882595.79943: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882595.79955: Calling all_plugins_play to load vars for managed_node1 30529 1726882595.79958: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882595.79961: Calling groups_plugins_play to load vars for managed_node1 30529 1726882595.80652: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000218 30529 1726882595.80656: WORKER PROCESS EXITING 30529 1726882595.83530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882595.87948: done with get_vars() 30529 1726882595.87974: done getting variables 30529 1726882595.88149: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:35 -0400 (0:00:00.990) 0:00:09.907 ****** 30529 1726882595.88189: entering _queue_task() for managed_node1/service 30529 1726882595.88948: worker is 1 (out of 1 available) 30529 1726882595.88960: exiting _queue_task() for managed_node1/service 30529 1726882595.88972: done queuing things up, now waiting for results queue to drain 30529 1726882595.88974: waiting for pending results... 30529 1726882595.89496: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882595.90200: in run() - task 12673a56-9f93-b0f1-edc0-000000000219 30529 1726882595.90204: variable 'ansible_search_path' from source: unknown 30529 1726882595.90206: variable 'ansible_search_path' from source: unknown 30529 1726882595.90209: calling self._execute() 30529 1726882595.90211: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882595.90218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882595.90220: variable 'omit' from source: magic vars 30529 1726882595.91347: variable 'ansible_distribution_major_version' from source: facts 30529 1726882595.91614: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882595.91910: variable 'network_provider' from source: set_fact 30529 1726882595.92198: Evaluated conditional (network_provider == "nm"): True 30529 1726882595.92201: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 
1726882595.92482: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882595.93198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882595.97279: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882595.97375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882595.97567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882595.97669: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882595.97828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882595.97956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882595.98109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882595.98140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882595.98401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882595.98404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882595.98407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882595.98436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882595.98536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882595.98580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882595.98635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882595.98800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882595.98804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882595.98826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882595.98948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882595.98968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882595.99362: variable 'network_connections' from source: include params 30529 1726882595.99382: variable 'interface' from source: play vars 30529 1726882595.99512: variable 'interface' from source: play vars 30529 1726882595.99835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882596.00005: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882596.00048: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882596.00133: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882596.00196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882596.00320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882596.00348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882596.00411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882596.00520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882596.00570: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882596.01143: variable 'network_connections' from source: include params 30529 1726882596.01154: variable 'interface' from source: play vars 30529 1726882596.01221: variable 'interface' from source: play vars 30529 1726882596.01474: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882596.01477: when evaluation is False, skipping this task 30529 1726882596.01479: _execute() done 30529 1726882596.01482: dumping result to json 30529 1726882596.01484: done dumping result, returning 30529 1726882596.01486: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000000219] 30529 1726882596.01500: sending task result for task 12673a56-9f93-b0f1-edc0-000000000219 30529 1726882596.01567: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000219 30529 1726882596.01570: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882596.01651: no more pending results, returning what we have 30529 1726882596.01655: results queue empty 30529 1726882596.01655: checking for any_errors_fatal 30529 1726882596.01684: done checking for any_errors_fatal 30529 1726882596.01688: checking for max_fail_percentage 30529 1726882596.01691: done checking for max_fail_percentage 30529 1726882596.01691: checking to see if all hosts have failed and the running result is not ok 30529 1726882596.01692: done checking to see if all hosts have failed 30529 1726882596.01695: getting the remaining hosts for this loop 30529 1726882596.01697: done getting the remaining hosts for this loop 30529 1726882596.01701: getting the next task 
for host managed_node1 30529 1726882596.01708: done getting next task for host managed_node1 30529 1726882596.01712: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882596.01717: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882596.01730: getting variables 30529 1726882596.01731: in VariableManager get_vars() 30529 1726882596.01764: Calling all_inventory to load vars for managed_node1 30529 1726882596.01766: Calling groups_inventory to load vars for managed_node1 30529 1726882596.01768: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882596.01778: Calling all_plugins_play to load vars for managed_node1 30529 1726882596.01780: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882596.01782: Calling groups_plugins_play to load vars for managed_node1 30529 1726882596.05919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882596.10649: done with get_vars() 30529 1726882596.10672: done getting variables 30529 1726882596.10738: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:36 -0400 (0:00:00.225) 0:00:10.133 ****** 30529 1726882596.10771: entering _queue_task() for managed_node1/service 30529 1726882596.11523: worker is 1 (out of 1 available) 30529 1726882596.11535: exiting _queue_task() for managed_node1/service 30529 1726882596.11549: done queuing things up, now waiting for results queue to drain 30529 1726882596.11550: waiting for pending results... 
30529 1726882596.12411: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882596.12417: in run() - task 12673a56-9f93-b0f1-edc0-00000000021a 30529 1726882596.12420: variable 'ansible_search_path' from source: unknown 30529 1726882596.12423: variable 'ansible_search_path' from source: unknown 30529 1726882596.12653: calling self._execute() 30529 1726882596.12657: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882596.12660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882596.12769: variable 'omit' from source: magic vars 30529 1726882596.13469: variable 'ansible_distribution_major_version' from source: facts 30529 1726882596.13532: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882596.13731: variable 'network_provider' from source: set_fact 30529 1726882596.13956: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882596.13959: when evaluation is False, skipping this task 30529 1726882596.13961: _execute() done 30529 1726882596.13964: dumping result to json 30529 1726882596.13967: done dumping result, returning 30529 1726882596.13970: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-00000000021a] 30529 1726882596.13972: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021a 30529 1726882596.14199: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021a 30529 1726882596.14202: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882596.14247: no more pending results, returning what we have 30529 1726882596.14252: results queue empty 30529 1726882596.14253: checking for any_errors_fatal 30529 1726882596.14264: done checking for 
any_errors_fatal 30529 1726882596.14265: checking for max_fail_percentage 30529 1726882596.14267: done checking for max_fail_percentage 30529 1726882596.14268: checking to see if all hosts have failed and the running result is not ok 30529 1726882596.14269: done checking to see if all hosts have failed 30529 1726882596.14269: getting the remaining hosts for this loop 30529 1726882596.14271: done getting the remaining hosts for this loop 30529 1726882596.14275: getting the next task for host managed_node1 30529 1726882596.14284: done getting next task for host managed_node1 30529 1726882596.14290: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882596.14298: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882596.14313: getting variables 30529 1726882596.14315: in VariableManager get_vars() 30529 1726882596.14348: Calling all_inventory to load vars for managed_node1 30529 1726882596.14351: Calling groups_inventory to load vars for managed_node1 30529 1726882596.14353: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882596.14366: Calling all_plugins_play to load vars for managed_node1 30529 1726882596.14369: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882596.14372: Calling groups_plugins_play to load vars for managed_node1 30529 1726882596.17167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882596.20479: done with get_vars() 30529 1726882596.20506: done getting variables 30529 1726882596.20567: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:36 -0400 (0:00:00.100) 0:00:10.234 ****** 30529 1726882596.20808: entering _queue_task() for managed_node1/copy 30529 1726882596.21326: worker is 1 (out of 1 available) 30529 1726882596.21339: exiting _queue_task() for managed_node1/copy 30529 1726882596.21351: done queuing things up, now waiting for results queue to drain 30529 1726882596.21353: waiting for pending results... 
30529 1726882596.21920: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882596.22127: in run() - task 12673a56-9f93-b0f1-edc0-00000000021b 30529 1726882596.22150: variable 'ansible_search_path' from source: unknown 30529 1726882596.22234: variable 'ansible_search_path' from source: unknown 30529 1726882596.22250: calling self._execute() 30529 1726882596.22600: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882596.22605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882596.22709: variable 'omit' from source: magic vars 30529 1726882596.23316: variable 'ansible_distribution_major_version' from source: facts 30529 1726882596.23334: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882596.23581: variable 'network_provider' from source: set_fact 30529 1726882596.23584: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882596.23589: when evaluation is False, skipping this task 30529 1726882596.23599: _execute() done 30529 1726882596.23608: dumping result to json 30529 1726882596.23700: done dumping result, returning 30529 1726882596.23715: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-00000000021b] 30529 1726882596.23726: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021b skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882596.23885: no more pending results, returning what we have 30529 1726882596.23892: results queue empty 30529 1726882596.23895: checking for any_errors_fatal 30529 1726882596.23902: done checking for any_errors_fatal 30529 1726882596.23903: checking for max_fail_percentage 30529 
1726882596.23905: done checking for max_fail_percentage 30529 1726882596.23906: checking to see if all hosts have failed and the running result is not ok 30529 1726882596.23907: done checking to see if all hosts have failed 30529 1726882596.23908: getting the remaining hosts for this loop 30529 1726882596.23910: done getting the remaining hosts for this loop 30529 1726882596.23914: getting the next task for host managed_node1 30529 1726882596.23922: done getting next task for host managed_node1 30529 1726882596.23926: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882596.23932: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882596.23948: getting variables 30529 1726882596.23950: in VariableManager get_vars() 30529 1726882596.23988: Calling all_inventory to load vars for managed_node1 30529 1726882596.23991: Calling groups_inventory to load vars for managed_node1 30529 1726882596.23996: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882596.24009: Calling all_plugins_play to load vars for managed_node1 30529 1726882596.24014: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882596.24017: Calling groups_plugins_play to load vars for managed_node1 30529 1726882596.25200: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021b 30529 1726882596.25204: WORKER PROCESS EXITING 30529 1726882596.27291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882596.30483: done with get_vars() 30529 1726882596.30513: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:36 -0400 (0:00:00.099) 0:00:10.334 ****** 30529 1726882596.30807: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882596.30809: Creating lock for fedora.linux_system_roles.network_connections 30529 1726882596.31353: worker is 1 (out of 1 available) 30529 1726882596.31365: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882596.31378: done queuing things up, now waiting for results queue to drain 30529 1726882596.31379: waiting for pending results... 
30529 1726882596.31970: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882596.32618: in run() - task 12673a56-9f93-b0f1-edc0-00000000021c 30529 1726882596.32633: variable 'ansible_search_path' from source: unknown 30529 1726882596.32636: variable 'ansible_search_path' from source: unknown 30529 1726882596.32669: calling self._execute() 30529 1726882596.32868: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882596.32872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882596.32882: variable 'omit' from source: magic vars 30529 1726882596.34010: variable 'ansible_distribution_major_version' from source: facts 30529 1726882596.34227: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882596.34232: variable 'omit' from source: magic vars 30529 1726882596.34234: variable 'omit' from source: magic vars 30529 1726882596.34771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882596.39451: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882596.39571: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882596.39764: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882596.39805: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882596.40052: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882596.40056: variable 'network_provider' from source: set_fact 30529 1726882596.40264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882596.40491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882596.40497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882596.40499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882596.40708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882596.40711: variable 'omit' from source: magic vars 30529 1726882596.41025: variable 'omit' from source: magic vars 30529 1726882596.41261: variable 'network_connections' from source: include params 30529 1726882596.41276: variable 'interface' from source: play vars 30529 1726882596.41423: variable 'interface' from source: play vars 30529 1726882596.41718: variable 'omit' from source: magic vars 30529 1726882596.41906: variable '__lsr_ansible_managed' from source: task vars 30529 1726882596.41909: variable '__lsr_ansible_managed' from source: task vars 30529 1726882596.42243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882596.43100: Loaded config def from plugin (lookup/template) 30529 1726882596.43104: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882596.43106: File lookup term: get_ansible_managed.j2 30529 1726882596.43108: variable 
'ansible_search_path' from source: unknown 30529 1726882596.43111: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882596.43114: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882596.43116: variable 'ansible_search_path' from source: unknown 30529 1726882596.52008: variable 'ansible_managed' from source: unknown 30529 1726882596.52221: variable 'omit' from source: magic vars 30529 1726882596.52297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882596.52328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882596.52349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882596.52381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882596.52398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882596.52430: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882596.52438: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882596.52446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882596.52561: Set connection var ansible_shell_executable to /bin/sh 30529 1726882596.52579: Set connection var ansible_pipelining to False 30529 1726882596.52590: Set connection var ansible_shell_type to sh 30529 1726882596.52606: Set connection var ansible_timeout to 10 30529 1726882596.52612: Set connection var ansible_connection to ssh 30529 1726882596.52621: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882596.52647: variable 'ansible_shell_executable' from source: unknown 30529 1726882596.52655: variable 'ansible_connection' from source: unknown 30529 1726882596.52661: variable 'ansible_module_compression' from source: unknown 30529 1726882596.52686: variable 'ansible_shell_type' from source: unknown 30529 1726882596.52692: variable 'ansible_shell_executable' from source: unknown 30529 1726882596.52696: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882596.52698: variable 'ansible_pipelining' from source: unknown 30529 1726882596.52701: variable 'ansible_timeout' from source: unknown 30529 1726882596.52799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882596.52844: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882596.52869: variable 'omit' from 
source: magic vars 30529 1726882596.52882: starting attempt loop 30529 1726882596.52890: running the handler 30529 1726882596.52919: _low_level_execute_command(): starting 30529 1726882596.52933: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882596.53684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.53754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882596.53802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882596.53840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882596.53925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882596.55576: stdout chunk (state=3): >>>/root <<< 30529 1726882596.55826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882596.55829: stdout chunk (state=3): >>><<< 30529 1726882596.55832: stderr chunk (state=3): >>><<< 30529 1726882596.55834: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882596.55916: _low_level_execute_command(): starting 30529 1726882596.55920: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982 `" && echo ansible-tmp-1726882596.558436-30986-221066711315982="` echo /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982 `" ) && sleep 0' 30529 1726882596.56758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882596.56772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882596.56787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.56905: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882596.56910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882596.56954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882596.56995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882596.58858: stdout chunk (state=3): >>>ansible-tmp-1726882596.558436-30986-221066711315982=/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982 <<< 30529 1726882596.58999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882596.59002: stdout chunk (state=3): >>><<< 30529 1726882596.59005: stderr chunk (state=3): >>><<< 30529 1726882596.59399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882596.558436-30986-221066711315982=/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882596.59402: variable 'ansible_module_compression' from source: unknown 30529 1726882596.59404: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 30529 1726882596.59406: ANSIBALLZ: Acquiring lock 30529 1726882596.59408: ANSIBALLZ: Lock acquired: 139794688685888 30529 1726882596.59410: ANSIBALLZ: Creating module 30529 1726882596.89350: ANSIBALLZ: Writing module into payload 30529 1726882596.89699: ANSIBALLZ: Writing module 30529 1726882596.89746: ANSIBALLZ: Renaming module 30529 1726882596.89761: ANSIBALLZ: Done creating module 30529 1726882596.89806: variable 'ansible_facts' from source: unknown 30529 1726882596.90110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py 30529 1726882596.90531: Sending initial data 30529 1726882596.90535: Sent initial data (167 bytes) 30529 1726882596.91154: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882596.91170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882596.91185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.91236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882596.91252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.91277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.91401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.91405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882596.91408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882596.91443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882596.91540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882596.93127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882596.93162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882596.93226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5bzmb163 /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py <<< 30529 1726882596.93229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py" <<< 30529 1726882596.93259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5bzmb163" to remote "/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py" <<< 30529 1726882596.94358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882596.94362: stdout chunk (state=3): >>><<< 30529 1726882596.94364: stderr chunk (state=3): >>><<< 30529 1726882596.94367: done transferring module to remote 30529 1726882596.94381: _low_level_execute_command(): starting 30529 1726882596.94390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/ /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py && sleep 0' 30529 1726882596.94964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 30529 1726882596.94973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882596.94984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.95017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882596.95020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882596.95023: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882596.95097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.95100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882596.95103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882596.95105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882596.95107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882596.95108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.95111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882596.95114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882596.95116: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882596.95128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.95168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882596.95180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882596.95190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30529 1726882596.95265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882596.97029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882596.97050: stderr chunk (state=3): >>><<< 30529 1726882596.97053: stdout chunk (state=3): >>><<< 30529 1726882596.97068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882596.97145: _low_level_execute_command(): starting 30529 1726882596.97148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/AnsiballZ_network_connections.py && sleep 0' 30529 1726882596.97659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882596.97672: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882596.97688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882596.97710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882596.97726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882596.97810: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882596.97835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882596.97859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882596.97924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.24776: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882597.27867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882597.27872: stdout chunk (state=3): >>><<< 30529 1726882597.27881: stderr chunk (state=3): >>><<< 30529 1726882597.27898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882597.27949: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882597.27958: _low_level_execute_command(): starting 30529 1726882597.27976: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882596.558436-30986-221066711315982/ > /dev/null 2>&1 && sleep 0' 30529 1726882597.28610: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.28628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882597.28644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882597.28662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882597.28763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.30863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882597.30897: stderr chunk (state=3): >>><<< 30529 1726882597.30901: stdout chunk (state=3): >>><<< 30529 1726882597.30910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882597.30917: handler run complete 30529 1726882597.30938: attempt loop complete, returning result 30529 1726882597.30940: _execute() done 30529 1726882597.30943: dumping result to json 30529 1726882597.30947: done dumping result, returning 30529 1726882597.30955: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-00000000021c] 30529 1726882597.30958: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021c 30529 1726882597.31055: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021c 30529 1726882597.31058: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9 30529 
1726882597.31150: no more pending results, returning what we have 30529 1726882597.31153: results queue empty 30529 1726882597.31154: checking for any_errors_fatal 30529 1726882597.31162: done checking for any_errors_fatal 30529 1726882597.31163: checking for max_fail_percentage 30529 1726882597.31164: done checking for max_fail_percentage 30529 1726882597.31165: checking to see if all hosts have failed and the running result is not ok 30529 1726882597.31166: done checking to see if all hosts have failed 30529 1726882597.31167: getting the remaining hosts for this loop 30529 1726882597.31168: done getting the remaining hosts for this loop 30529 1726882597.31172: getting the next task for host managed_node1 30529 1726882597.31179: done getting next task for host managed_node1 30529 1726882597.31182: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882597.31186: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882597.31198: getting variables 30529 1726882597.31200: in VariableManager get_vars() 30529 1726882597.31232: Calling all_inventory to load vars for managed_node1 30529 1726882597.31234: Calling groups_inventory to load vars for managed_node1 30529 1726882597.31236: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882597.31246: Calling all_plugins_play to load vars for managed_node1 30529 1726882597.31248: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882597.31250: Calling groups_plugins_play to load vars for managed_node1 30529 1726882597.32162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882597.33090: done with get_vars() 30529 1726882597.33108: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:37 -0400 (0:00:01.023) 0:00:11.357 ****** 30529 1726882597.33166: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882597.33168: Creating lock for fedora.linux_system_roles.network_state 30529 1726882597.33391: worker is 1 (out of 1 available) 30529 1726882597.33406: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882597.33421: done queuing things up, now waiting for results queue to drain 30529 1726882597.33423: waiting for pending results... 
30529 1726882597.33674: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882597.33842: in run() - task 12673a56-9f93-b0f1-edc0-00000000021d 30529 1726882597.33847: variable 'ansible_search_path' from source: unknown 30529 1726882597.33851: variable 'ansible_search_path' from source: unknown 30529 1726882597.33853: calling self._execute() 30529 1726882597.33952: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882597.33955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882597.33958: variable 'omit' from source: magic vars 30529 1726882597.34279: variable 'ansible_distribution_major_version' from source: facts 30529 1726882597.34290: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882597.34436: variable 'network_state' from source: role '' defaults 30529 1726882597.34443: Evaluated conditional (network_state != {}): False 30529 1726882597.34446: when evaluation is False, skipping this task 30529 1726882597.34449: _execute() done 30529 1726882597.34451: dumping result to json 30529 1726882597.34454: done dumping result, returning 30529 1726882597.34461: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-00000000021d] 30529 1726882597.34465: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021d 30529 1726882597.34551: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021d 30529 1726882597.34554: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882597.34615: no more pending results, returning what we have 30529 1726882597.34619: results queue empty 30529 1726882597.34620: checking for any_errors_fatal 30529 1726882597.34627: done checking for any_errors_fatal 
30529 1726882597.34628: checking for max_fail_percentage 30529 1726882597.34629: done checking for max_fail_percentage 30529 1726882597.34630: checking to see if all hosts have failed and the running result is not ok 30529 1726882597.34631: done checking to see if all hosts have failed 30529 1726882597.34632: getting the remaining hosts for this loop 30529 1726882597.34633: done getting the remaining hosts for this loop 30529 1726882597.34639: getting the next task for host managed_node1 30529 1726882597.34646: done getting next task for host managed_node1 30529 1726882597.34649: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882597.34654: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882597.34672: getting variables 30529 1726882597.34674: in VariableManager get_vars() 30529 1726882597.34735: Calling all_inventory to load vars for managed_node1 30529 1726882597.34737: Calling groups_inventory to load vars for managed_node1 30529 1726882597.34739: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882597.34745: Calling all_plugins_play to load vars for managed_node1 30529 1726882597.34747: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882597.34748: Calling groups_plugins_play to load vars for managed_node1 30529 1726882597.35869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882597.38088: done with get_vars() 30529 1726882597.38120: done getting variables 30529 1726882597.38187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:37 -0400 (0:00:00.050) 0:00:11.408 ****** 30529 1726882597.38223: entering _queue_task() for managed_node1/debug 30529 1726882597.38443: worker is 1 (out of 1 available) 30529 1726882597.38456: exiting _queue_task() for managed_node1/debug 30529 1726882597.38468: done queuing things up, now waiting for results queue to drain 30529 1726882597.38470: waiting for pending results... 
30529 1726882597.38648: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30529 1726882597.38732: in run() - task 12673a56-9f93-b0f1-edc0-00000000021e
30529 1726882597.38743: variable 'ansible_search_path' from source: unknown
30529 1726882597.38746: variable 'ansible_search_path' from source: unknown
30529 1726882597.38775: calling self._execute()
30529 1726882597.38850: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.38855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.38863: variable 'omit' from source: magic vars
30529 1726882597.39126: variable 'ansible_distribution_major_version' from source: facts
30529 1726882597.39136: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882597.39145: variable 'omit' from source: magic vars
30529 1726882597.39187: variable 'omit' from source: magic vars
30529 1726882597.39213: variable 'omit' from source: magic vars
30529 1726882597.39248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882597.39276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882597.39294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882597.39308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.39318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.39342: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882597.39346: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.39350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.39420: Set connection var ansible_shell_executable to /bin/sh
30529 1726882597.39423: Set connection var ansible_pipelining to False
30529 1726882597.39426: Set connection var ansible_shell_type to sh
30529 1726882597.39433: Set connection var ansible_timeout to 10
30529 1726882597.39436: Set connection var ansible_connection to ssh
30529 1726882597.39440: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882597.39460: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.39463: variable 'ansible_connection' from source: unknown
30529 1726882597.39465: variable 'ansible_module_compression' from source: unknown
30529 1726882597.39468: variable 'ansible_shell_type' from source: unknown
30529 1726882597.39470: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.39473: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.39475: variable 'ansible_pipelining' from source: unknown
30529 1726882597.39477: variable 'ansible_timeout' from source: unknown
30529 1726882597.39480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.39576: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882597.39585: variable 'omit' from source: magic vars
30529 1726882597.39597: starting attempt loop
30529 1726882597.39603: running the handler
30529 1726882597.39692: variable '__network_connections_result' from source: set_fact
30529 1726882597.39735: handler run complete
30529 1726882597.39748: attempt loop complete, returning result
30529 1726882597.39751: _execute() done
30529 1726882597.39753: dumping result to json
30529 1726882597.39755: done dumping result, returning
30529 1726882597.39763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000021e]
30529 1726882597.39768: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021e
30529 1726882597.39850: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021e
30529 1726882597.39853: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9"
    ]
}
30529 1726882597.39935: no more pending results, returning what we have
30529 1726882597.39938: results queue empty
30529 1726882597.39938: checking for any_errors_fatal
30529 1726882597.39943: done checking for any_errors_fatal
30529 1726882597.39944: checking for max_fail_percentage
30529 1726882597.39945: done checking for max_fail_percentage
30529 1726882597.39946: checking to see if all hosts have failed and the running result is not ok
30529 1726882597.39946: done checking to see if all hosts have failed
30529 1726882597.39947: getting the remaining hosts for this loop
30529 1726882597.39948: done getting the remaining hosts for this loop
30529 1726882597.39951: getting the next task for host managed_node1
30529 1726882597.39957: done getting next task for host managed_node1
30529 1726882597.39960: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30529 1726882597.39966: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882597.39975: getting variables
30529 1726882597.39977: in VariableManager get_vars()
30529 1726882597.40007: Calling all_inventory to load vars for managed_node1
30529 1726882597.40010: Calling groups_inventory to load vars for managed_node1
30529 1726882597.40013: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882597.40021: Calling all_plugins_play to load vars for managed_node1
30529 1726882597.40023: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882597.40026: Calling groups_plugins_play to load vars for managed_node1
30529 1726882597.41334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882597.43410: done with get_vars()
30529 1726882597.43430: done getting variables
30529 1726882597.43482: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:36:37 -0400 (0:00:00.053) 0:00:11.461 ******
30529 1726882597.43527: entering _queue_task() for managed_node1/debug
30529 1726882597.43775: worker is 1 (out of 1 available)
30529 1726882597.43787: exiting _queue_task() for managed_node1/debug
30529 1726882597.43800: done queuing things up, now waiting for results queue to drain
30529 1726882597.43802: waiting for pending results...
30529 1726882597.44208: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30529 1726882597.44213: in run() - task 12673a56-9f93-b0f1-edc0-00000000021f
30529 1726882597.44216: variable 'ansible_search_path' from source: unknown
30529 1726882597.44218: variable 'ansible_search_path' from source: unknown
30529 1726882597.44243: calling self._execute()
30529 1726882597.44335: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.44346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.44359: variable 'omit' from source: magic vars
30529 1726882597.44709: variable 'ansible_distribution_major_version' from source: facts
30529 1726882597.44726: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882597.44738: variable 'omit' from source: magic vars
30529 1726882597.44814: variable 'omit' from source: magic vars
30529 1726882597.44898: variable 'omit' from source: magic vars
30529 1726882597.44938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882597.45208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882597.45211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882597.45215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.45390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.45396: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882597.45399: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.45401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.45547: Set connection var ansible_shell_executable to /bin/sh
30529 1726882597.45552: Set connection var ansible_pipelining to False
30529 1726882597.45555: Set connection var ansible_shell_type to sh
30529 1726882597.45570: Set connection var ansible_timeout to 10
30529 1726882597.45573: Set connection var ansible_connection to ssh
30529 1726882597.45575: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882597.45846: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.45849: variable 'ansible_connection' from source: unknown
30529 1726882597.45852: variable 'ansible_module_compression' from source: unknown
30529 1726882597.45854: variable 'ansible_shell_type' from source: unknown
30529 1726882597.45856: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.45858: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.45860: variable 'ansible_pipelining' from source: unknown
30529 1726882597.45862: variable 'ansible_timeout' from source: unknown
30529 1726882597.45864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.45976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882597.45999: variable 'omit' from source: magic vars
30529 1726882597.46011: starting attempt loop
30529 1726882597.46019: running the handler
30529 1726882597.46077: variable '__network_connections_result' from source: set_fact
30529 1726882597.46216: variable '__network_connections_result' from source: set_fact
30529 1726882597.46309: handler run complete
30529 1726882597.46344: attempt loop complete, returning result
30529 1726882597.46351: _execute() done
30529 1726882597.46358: dumping result to json
30529 1726882597.46366: done dumping result, returning
30529 1726882597.46378: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000021f]
30529 1726882597.46396: sending task result for task 12673a56-9f93-b0f1-edc0-00000000021f
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9"
        ]
    }
}
30529 1726882597.46685: no more pending results, returning what we have
30529 1726882597.46689: results queue empty
30529 1726882597.46690: checking for any_errors_fatal
30529 1726882597.46699: done checking for any_errors_fatal
30529 1726882597.46700: checking for max_fail_percentage
30529 1726882597.46702: done checking for max_fail_percentage
30529 1726882597.46702: checking to see if all hosts have failed and the running result is not ok
30529 1726882597.46703: done checking to see if all hosts have failed
30529 1726882597.46704: getting the remaining hosts for this loop
30529 1726882597.46706: done getting the remaining hosts for this loop
30529 1726882597.46710: getting the next task for host managed_node1
30529 1726882597.46721: done getting next task for host managed_node1
30529 1726882597.46725: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30529 1726882597.46729: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882597.46739: getting variables
30529 1726882597.46741: in VariableManager get_vars()
30529 1726882597.46773: Calling all_inventory to load vars for managed_node1
30529 1726882597.46775: Calling groups_inventory to load vars for managed_node1
30529 1726882597.46783: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882597.46940: Calling all_plugins_play to load vars for managed_node1
30529 1726882597.46944: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882597.46950: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000021f
30529 1726882597.46953: WORKER PROCESS EXITING
30529 1726882597.46957: Calling groups_plugins_play to load vars for managed_node1
30529 1726882597.48369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882597.49978: done with get_vars()
30529 1726882597.50002: done getting variables
30529 1726882597.50068: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:36:37 -0400 (0:00:00.065) 0:00:11.527 ******
30529 1726882597.50105: entering _queue_task() for managed_node1/debug
30529 1726882597.50425: worker is 1 (out of 1 available)
30529 1726882597.50436: exiting _queue_task() for managed_node1/debug
30529 1726882597.50447: done queuing things up, now waiting for results queue to drain
30529 1726882597.50449: waiting for pending results...
30529 1726882597.50729: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30529 1726882597.50864: in run() - task 12673a56-9f93-b0f1-edc0-000000000220
30529 1726882597.50882: variable 'ansible_search_path' from source: unknown
30529 1726882597.50897: variable 'ansible_search_path' from source: unknown
30529 1726882597.50940: calling self._execute()
30529 1726882597.51022: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.51048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.51050: variable 'omit' from source: magic vars
30529 1726882597.51432: variable 'ansible_distribution_major_version' from source: facts
30529 1726882597.51485: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882597.51582: variable 'network_state' from source: role '' defaults
30529 1726882597.51608: Evaluated conditional (network_state != {}): False
30529 1726882597.51618: when evaluation is False, skipping this task
30529 1726882597.51626: _execute() done
30529 1726882597.51633: dumping result to json
30529 1726882597.51708: done dumping result, returning
30529 1726882597.51712: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000000220]
30529 1726882597.51714: sending task result for task 12673a56-9f93-b0f1-edc0-000000000220
30529 1726882597.51781: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000220
30529 1726882597.51784: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
30529 1726882597.51857: no more pending results, returning what we have
30529 1726882597.51861: results queue empty
30529 1726882597.51862: checking for any_errors_fatal
30529 1726882597.51875: done checking for any_errors_fatal
30529 1726882597.51876: checking for max_fail_percentage
30529 1726882597.51878: done checking for max_fail_percentage
30529 1726882597.51879: checking to see if all hosts have failed and the running result is not ok
30529 1726882597.51880: done checking to see if all hosts have failed
30529 1726882597.51881: getting the remaining hosts for this loop
30529 1726882597.51882: done getting the remaining hosts for this loop
30529 1726882597.51887: getting the next task for host managed_node1
30529 1726882597.51897: done getting next task for host managed_node1
30529 1726882597.51902: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30529 1726882597.51906: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882597.51921: getting variables
30529 1726882597.51923: in VariableManager get_vars()
30529 1726882597.51957: Calling all_inventory to load vars for managed_node1
30529 1726882597.51959: Calling groups_inventory to load vars for managed_node1
30529 1726882597.51962: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882597.51975: Calling all_plugins_play to load vars for managed_node1
30529 1726882597.51978: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882597.51981: Calling groups_plugins_play to load vars for managed_node1
30529 1726882597.53478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882597.54566: done with get_vars()
30529 1726882597.54581: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:36:37 -0400 (0:00:00.045) 0:00:11.572 ******
30529 1726882597.54648: entering _queue_task() for managed_node1/ping
30529 1726882597.54649: Creating lock for ping
30529 1726882597.54848: worker is 1 (out of 1 available)
30529 1726882597.54860: exiting _queue_task() for managed_node1/ping
30529 1726882597.54872: done queuing things up, now waiting for results queue to drain
30529 1726882597.54875: waiting for pending results...
30529 1726882597.55046: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
30529 1726882597.55139: in run() - task 12673a56-9f93-b0f1-edc0-000000000221
30529 1726882597.55150: variable 'ansible_search_path' from source: unknown
30529 1726882597.55153: variable 'ansible_search_path' from source: unknown
30529 1726882597.55181: calling self._execute()
30529 1726882597.55248: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.55252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.55259: variable 'omit' from source: magic vars
30529 1726882597.55525: variable 'ansible_distribution_major_version' from source: facts
30529 1726882597.55541: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882597.55545: variable 'omit' from source: magic vars
30529 1726882597.55585: variable 'omit' from source: magic vars
30529 1726882597.55611: variable 'omit' from source: magic vars
30529 1726882597.55645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882597.55673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882597.55687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882597.55705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.55716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882597.55737: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882597.55740: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.55742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.55817: Set connection var ansible_shell_executable to /bin/sh
30529 1726882597.55821: Set connection var ansible_pipelining to False
30529 1726882597.55824: Set connection var ansible_shell_type to sh
30529 1726882597.55831: Set connection var ansible_timeout to 10
30529 1726882597.55834: Set connection var ansible_connection to ssh
30529 1726882597.55838: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882597.55856: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.55859: variable 'ansible_connection' from source: unknown
30529 1726882597.55863: variable 'ansible_module_compression' from source: unknown
30529 1726882597.55866: variable 'ansible_shell_type' from source: unknown
30529 1726882597.55868: variable 'ansible_shell_executable' from source: unknown
30529 1726882597.55871: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882597.55874: variable 'ansible_pipelining' from source: unknown
30529 1726882597.55876: variable 'ansible_timeout' from source: unknown
30529 1726882597.55878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882597.56022: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
30529 1726882597.56031: variable 'omit' from source: magic vars
30529 1726882597.56036: starting attempt loop
30529 1726882597.56038: running the handler
30529 1726882597.56049: _low_level_execute_command(): starting
30529 1726882597.56056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30529 1726882597.56800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882597.56810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882597.56862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882597.58538: stdout chunk (state=3): >>>/root <<<
30529 1726882597.58625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882597.58660: stderr chunk (state=3): >>><<<
30529 1726882597.58662: stdout chunk (state=3): >>><<<
30529 1726882597.58699: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882597.58703: _low_level_execute_command(): starting
30529 1726882597.58706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161 `" && echo ansible-tmp-1726882597.5867934-31034-270768662842161="` echo /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161 `" ) && sleep 0'
30529 1726882597.59080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882597.59083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
30529 1726882597.59101: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882597.59125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30529 1726882597.59134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882597.59172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882597.59184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882597.59234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882597.61105: stdout chunk (state=3): >>>ansible-tmp-1726882597.5867934-31034-270768662842161=/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161 <<<
30529 1726882597.61242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882597.61244: stdout chunk (state=3): >>><<<
30529 1726882597.61246: stderr chunk (state=3): >>><<<
30529 1726882597.61270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882597.5867934-31034-270768662842161=/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882597.61314: variable 'ansible_module_compression' from source: unknown
30529 1726882597.61346: ANSIBALLZ: Using lock for ping
30529 1726882597.61349: ANSIBALLZ: Acquiring lock
30529 1726882597.61351: ANSIBALLZ: Lock acquired: 139794696816496
30529 1726882597.61354: ANSIBALLZ: Creating module
30529 1726882597.71165: ANSIBALLZ: Writing module into payload
30529 1726882597.71204: ANSIBALLZ: Writing module
30529 1726882597.71219: ANSIBALLZ: Renaming module
30529 1726882597.71224: ANSIBALLZ: Done creating module
30529 1726882597.71238: variable 'ansible_facts' from source: unknown
30529 1726882597.71288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py
30529 1726882597.71391: Sending initial data
30529 1726882597.71397: Sent initial data (153 bytes)
30529 1726882597.71846: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882597.71849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882597.71852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<<
30529 1726882597.71855: stderr chunk (state=3): >>>debug1: Reading
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.71908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882597.71939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882597.71968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.73525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882597.73528: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882597.73565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882597.73622: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwze9x4o5 /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py <<< 30529 1726882597.73629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py" <<< 30529 1726882597.73671: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwze9x4o5" to remote "/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py" <<< 30529 1726882597.74620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882597.74623: stderr chunk (state=3): >>><<< 30529 1726882597.74625: stdout chunk (state=3): >>><<< 30529 1726882597.74797: done transferring module to remote 30529 1726882597.74802: _low_level_execute_command(): starting 30529 1726882597.74807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/ /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py && sleep 0' 30529 1726882597.75920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882597.75924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.76099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882597.76171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882597.76242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882597.76353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.78115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882597.78136: stdout chunk (state=3): >>><<< 30529 1726882597.78139: stderr chunk (state=3): >>><<< 30529 1726882597.78234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882597.78237: _low_level_execute_command(): starting 30529 1726882597.78241: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/AnsiballZ_ping.py && sleep 0' 30529 1726882597.78746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882597.78755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882597.78831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.78862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882597.78873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882597.78921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 
1726882597.78972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.93865: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882597.95038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882597.95060: stderr chunk (state=3): >>><<< 30529 1726882597.95063: stdout chunk (state=3): >>><<< 30529 1726882597.95077: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
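The exchange above completes a full module round-trip: `AnsiballZ_ping.py` runs under `/usr/bin/python3.12` on the target, prints a single JSON document on stdout, and `_low_level_execute_command()` hands back `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`. A minimal sketch of that module contract (hypothetical code, not the actual `ansible.builtin.ping` source):

```python
import json

def ping(module_args):
    """Echo the 'data' argument back (default 'pong'), shaped like the
    stdout chunk above: {"ping": ..., "invocation": {"module_args": ...}}."""
    data = module_args.get("data", "pong")
    # ansible.builtin.ping documents data='crash' as a way to force a
    # module exception; modeled here as a plain raise.
    if data == "crash":
        raise RuntimeError("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping({})))
```

Running this prints `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`, matching the stdout chunk in the trace.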
30529 1726882597.95120: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882597.95126: _low_level_execute_command(): starting 30529 1726882597.95128: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882597.5867934-31034-270768662842161/ > /dev/null 2>&1 && sleep 0' 30529 1726882597.95622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882597.95626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882597.95628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.95630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882597.95632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882597.95634: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882597.95692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882597.95700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882597.95743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882597.97519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882597.97541: stderr chunk (state=3): >>><<< 30529 1726882597.97544: stdout chunk (state=3): >>><<< 30529 1726882597.97557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30529 1726882597.97562: handler run complete 30529 1726882597.97573: attempt loop complete, returning result 30529 1726882597.97575: _execute() done 30529 1726882597.97578: dumping result to json 30529 1726882597.97580: done dumping result, returning 30529 1726882597.97595: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000000221] 30529 1726882597.97599: sending task result for task 12673a56-9f93-b0f1-edc0-000000000221 30529 1726882597.97741: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000221 30529 1726882597.97744: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882597.97856: no more pending results, returning what we have 30529 1726882597.97859: results queue empty 30529 1726882597.97860: checking for any_errors_fatal 30529 1726882597.97867: done checking for any_errors_fatal 30529 1726882597.97867: checking for max_fail_percentage 30529 1726882597.97869: done checking for max_fail_percentage 30529 1726882597.97869: checking to see if all hosts have failed and the running result is not ok 30529 1726882597.97870: done checking to see if all hosts have failed 30529 1726882597.97871: getting the remaining hosts for this loop 30529 1726882597.97873: done getting the remaining hosts for this loop 30529 1726882597.97876: getting the next task for host managed_node1 30529 1726882597.97886: done getting next task for host managed_node1 30529 1726882597.97888: ^ task is: TASK: meta (role_complete) 30529 1726882597.97892: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882597.97917: getting variables 30529 1726882597.97919: in VariableManager get_vars() 30529 1726882597.97954: Calling all_inventory to load vars for managed_node1 30529 1726882597.97957: Calling groups_inventory to load vars for managed_node1 30529 1726882597.97959: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882597.97969: Calling all_plugins_play to load vars for managed_node1 30529 1726882597.97972: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882597.97974: Calling groups_plugins_play to load vars for managed_node1 30529 1726882597.98927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882597.99885: done with get_vars() 30529 1726882597.99905: done getting variables 30529 1726882597.99960: done queuing things up, now waiting for results queue to drain 30529 1726882597.99962: results queue empty 30529 1726882597.99963: checking for any_errors_fatal 30529 1726882597.99965: done checking for 
any_errors_fatal 30529 1726882597.99965: checking for max_fail_percentage 30529 1726882597.99966: done checking for max_fail_percentage 30529 1726882597.99967: checking to see if all hosts have failed and the running result is not ok 30529 1726882597.99967: done checking to see if all hosts have failed 30529 1726882597.99967: getting the remaining hosts for this loop 30529 1726882597.99968: done getting the remaining hosts for this loop 30529 1726882597.99970: getting the next task for host managed_node1 30529 1726882597.99973: done getting next task for host managed_node1 30529 1726882597.99974: ^ task is: TASK: Show result 30529 1726882597.99976: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882597.99978: getting variables 30529 1726882597.99978: in VariableManager get_vars() 30529 1726882597.99985: Calling all_inventory to load vars for managed_node1 30529 1726882597.99988: Calling groups_inventory to load vars for managed_node1 30529 1726882597.99990: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882597.99995: Calling all_plugins_play to load vars for managed_node1 30529 1726882597.99997: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882597.99999: Calling groups_plugins_play to load vars for managed_node1 30529 1726882598.00727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882598.01615: done with get_vars() 30529 1726882598.01638: done getting variables 30529 1726882598.01674: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:36:38 -0400 (0:00:00.470) 0:00:12.043 ****** 30529 1726882598.01708: entering _queue_task() for managed_node1/debug 30529 1726882598.02016: worker is 1 (out of 1 available) 30529 1726882598.02030: exiting _queue_task() for managed_node1/debug 30529 1726882598.02043: done queuing things up, now waiting for results queue to drain 30529 1726882598.02045: waiting for pending results... 
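The module cycle that just finished follows the same temp-directory lifecycle every task in this trace uses: create a per-task directory under a 077 umask, drop the `AnsiballZ_*.py` payload into it over SFTP, `chmod u+x` the directory and the payload, execute, then `rm -f -r` the directory. A local re-creation of that sequence (hypothetical paths, standing in for the `ansible-tmp-...` directories above):

```python
import os
import shutil
import stat
import tempfile
import time

# Create the per-task directory with 0700 permissions, as the
# "( umask 77 && mkdir -p ... && mkdir ... )" command in the trace does.
old_umask = os.umask(0o77)
base = tempfile.mkdtemp(prefix="ansible-demo-")
task_dir = os.path.join(base, f"ansible-tmp-{time.time()}-{os.getpid()}")
os.mkdir(task_dir)
os.umask(old_umask)

# Stand-in for the payload transferred via "sftp> put ... AnsiballZ_ping.py".
module = os.path.join(task_dir, "AnsiballZ_ping.py")
open(module, "w").close()

# Mirrors "chmod u+x <task_dir>/ <task_dir>/AnsiballZ_ping.py".
for path in (task_dir, module):
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

# Mirrors the final cleanup: "rm -f -r <task_dir>/ > /dev/null 2>&1".
shutil.rmtree(task_dir)
print(os.path.exists(task_dir))
```

The restrictive umask is why the remote directory needs an explicit `chmod u+x` before the payload can be executed in place.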
30529 1726882598.02243: running TaskExecutor() for managed_node1/TASK: Show result 30529 1726882598.02324: in run() - task 12673a56-9f93-b0f1-edc0-00000000018f 30529 1726882598.02339: variable 'ansible_search_path' from source: unknown 30529 1726882598.02342: variable 'ansible_search_path' from source: unknown 30529 1726882598.02370: calling self._execute() 30529 1726882598.02450: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.02454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.02462: variable 'omit' from source: magic vars 30529 1726882598.02935: variable 'ansible_distribution_major_version' from source: facts 30529 1726882598.02939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882598.02942: variable 'omit' from source: magic vars 30529 1726882598.02991: variable 'omit' from source: magic vars 30529 1726882598.03040: variable 'omit' from source: magic vars 30529 1726882598.03099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882598.03150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882598.03192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882598.03207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882598.03259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882598.03269: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882598.03301: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.03305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.03508: Set 
connection var ansible_shell_executable to /bin/sh 30529 1726882598.03514: Set connection var ansible_pipelining to False 30529 1726882598.03521: Set connection var ansible_shell_type to sh 30529 1726882598.03539: Set connection var ansible_timeout to 10 30529 1726882598.03547: Set connection var ansible_connection to ssh 30529 1726882598.03558: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882598.03618: variable 'ansible_shell_executable' from source: unknown 30529 1726882598.03621: variable 'ansible_connection' from source: unknown 30529 1726882598.03630: variable 'ansible_module_compression' from source: unknown 30529 1726882598.03632: variable 'ansible_shell_type' from source: unknown 30529 1726882598.03635: variable 'ansible_shell_executable' from source: unknown 30529 1726882598.03637: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.03639: variable 'ansible_pipelining' from source: unknown 30529 1726882598.03799: variable 'ansible_timeout' from source: unknown 30529 1726882598.03802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.03970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882598.03989: variable 'omit' from source: magic vars 30529 1726882598.04004: starting attempt loop 30529 1726882598.04014: running the handler 30529 1726882598.04114: variable '__network_connections_result' from source: set_fact 30529 1726882598.04367: variable '__network_connections_result' from source: set_fact 30529 1726882598.04626: handler run complete 30529 1726882598.04692: attempt loop complete, returning result 30529 1726882598.04697: _execute() done 30529 1726882598.04700: dumping result to json 30529 
1726882598.04703: done dumping result, returning 30529 1726882598.04706: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-00000000018f] 30529 1726882598.04708: sending task result for task 12673a56-9f93-b0f1-edc0-00000000018f ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9" ] } } 30529 1726882598.04914: no more pending results, returning what we have 30529 1726882598.04918: results queue empty 30529 1726882598.04920: checking for any_errors_fatal 30529 1726882598.04922: done checking for any_errors_fatal 30529 1726882598.04922: checking for max_fail_percentage 30529 1726882598.04924: done checking for max_fail_percentage 30529 1726882598.04925: checking to see if all hosts have failed and the running result is not ok 30529 1726882598.04926: done checking to see if all hosts have failed 30529 1726882598.04927: getting the remaining hosts for this loop 30529 1726882598.04929: done getting the remaining hosts for this loop 30529 1726882598.04933: getting the next task for host managed_node1 30529 1726882598.04946: done getting next task for host managed_node1 30529 1726882598.04951: ^ task is: TASK: Asserts 30529 1726882598.04954: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882598.04960: getting variables 30529 1726882598.04962: in VariableManager get_vars() 30529 1726882598.05226: Calling all_inventory to load vars for managed_node1 30529 1726882598.05229: Calling groups_inventory to load vars for managed_node1 30529 1726882598.05234: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882598.05247: Calling all_plugins_play to load vars for managed_node1 30529 1726882598.05251: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882598.05254: Calling groups_plugins_play to load vars for managed_node1 30529 1726882598.05916: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000018f 30529 1726882598.05919: WORKER PROCESS EXITING 30529 1726882598.07596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882598.08611: done with get_vars() 30529 1726882598.08626: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:36:38 -0400 (0:00:00.069) 0:00:12.113 ****** 30529 1726882598.08701: entering _queue_task() for managed_node1/include_tasks 30529 1726882598.08934: worker is 1 (out of 1 available) 30529 1726882598.08947: exiting _queue_task() for managed_node1/include_tasks 30529 1726882598.08959: done queuing things up, now waiting for results queue to drain 30529 1726882598.08960: waiting for pending results... 
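The "Show result" debug task above printed the `__network_connections_result` fact set earlier by the role. A hypothetical inspection of that payload (only the fields shown in the trace, with the same values):

```python
# Subset of the __network_connections_result payload from the trace.
result = {
    "changed": True,
    "failed": False,
    "stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': "
        "add connection statebr, 83eb4df0-1051-4e47-8e0d-c7726e739ed9"
    ],
}

# A successful run changed state without failing.
assert result["changed"] and not result["failed"]

# The provider logs one line per connection action; the profile name sits
# between the first pair of single quotes.
profile = result["stderr_lines"][0].split("'")[1]
print(profile)  # → statebr
```

This is the profile (`statebr`, type `bridge`, provider `nm`) that the later `assert_profile_present.yml` include goes on to verify.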
30529 1726882598.09147: running TaskExecutor() for managed_node1/TASK: Asserts
30529 1726882598.09282: in run() - task 12673a56-9f93-b0f1-edc0-000000000096
30529 1726882598.09300: variable 'ansible_search_path' from source: unknown
30529 1726882598.09306: variable 'ansible_search_path' from source: unknown
30529 1726882598.09353: variable 'lsr_assert' from source: include params
30529 1726882598.09507: variable 'lsr_assert' from source: include params
30529 1726882598.09559: variable 'omit' from source: magic vars
30529 1726882598.09651: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.09658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.09667: variable 'omit' from source: magic vars
30529 1726882598.09831: variable 'ansible_distribution_major_version' from source: facts
30529 1726882598.09839: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882598.09845: variable 'item' from source: unknown
30529 1726882598.09897: variable 'item' from source: unknown
30529 1726882598.09918: variable 'item' from source: unknown
30529 1726882598.09961: variable 'item' from source: unknown
30529 1726882598.10085: dumping result to json
30529 1726882598.10088: done dumping result, returning
30529 1726882598.10090: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-000000000096]
30529 1726882598.10092: sending task result for task 12673a56-9f93-b0f1-edc0-000000000096
30529 1726882598.10128: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000096
30529 1726882598.10130: WORKER PROCESS EXITING
30529 1726882598.10156: no more pending results, returning what we have
30529 1726882598.10161: in VariableManager get_vars()
30529 1726882598.10197: Calling all_inventory to load vars for managed_node1
30529 1726882598.10199: Calling groups_inventory to load vars for managed_node1
30529 1726882598.10202: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882598.10215: Calling all_plugins_play to load vars for managed_node1
30529 1726882598.10217: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882598.10220: Calling groups_plugins_play to load vars for managed_node1
30529 1726882598.11110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882598.12175: done with get_vars()
30529 1726882598.12202: variable 'ansible_search_path' from source: unknown
30529 1726882598.12204: variable 'ansible_search_path' from source: unknown
30529 1726882598.12233: we have included files to process
30529 1726882598.12234: generating all_blocks data
30529 1726882598.12235: done generating all_blocks data
30529 1726882598.12239: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30529 1726882598.12240: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30529 1726882598.12241: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30529 1726882598.12475: in VariableManager get_vars()
30529 1726882598.12507: done with get_vars()
30529 1726882598.12811: done processing included file
30529 1726882598.12816: iterating over new_blocks loaded from include file
30529 1726882598.12819: in VariableManager get_vars()
30529 1726882598.12834: done with get_vars()
30529 1726882598.12838: filtering new block on tags
30529 1726882598.12877: done filtering new block on tags
30529 1726882598.12879: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=tasks/assert_profile_present.yml)
30529 1726882598.12883: extending task lists for all hosts with included blocks
30529 1726882598.13694: done extending task lists
30529 1726882598.13695: done processing included files
30529 1726882598.13696: results queue empty
30529 1726882598.13696: checking for any_errors_fatal
30529 1726882598.13699: done checking for any_errors_fatal
30529 1726882598.13700: checking for max_fail_percentage
30529 1726882598.13700: done checking for max_fail_percentage
30529 1726882598.13701: checking to see if all hosts have failed and the running result is not ok
30529 1726882598.13702: done checking to see if all hosts have failed
30529 1726882598.13702: getting the remaining hosts for this loop
30529 1726882598.13703: done getting the remaining hosts for this loop
30529 1726882598.13705: getting the next task for host managed_node1
30529 1726882598.13708: done getting next task for host managed_node1
30529 1726882598.13709: ^ task is: TASK: Include the task 'get_profile_stat.yml'
30529 1726882598.13711: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882598.13713: getting variables
30529 1726882598.13713: in VariableManager get_vars()
30529 1726882598.13719: Calling all_inventory to load vars for managed_node1
30529 1726882598.13720: Calling groups_inventory to load vars for managed_node1
30529 1726882598.13722: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882598.13726: Calling all_plugins_play to load vars for managed_node1
30529 1726882598.13727: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882598.13729: Calling groups_plugins_play to load vars for managed_node1
30529 1726882598.14725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882598.16281: done with get_vars()
30529 1726882598.16297: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024  21:36:38 -0400 (0:00:00.076)       0:00:12.189 ******
30529 1726882598.16347: entering _queue_task() for managed_node1/include_tasks
30529 1726882598.16579: worker is 1 (out of 1 available)
30529 1726882598.16592: exiting _queue_task() for managed_node1/include_tasks
30529 1726882598.16607: done queuing things up, now waiting for results queue to drain
30529 1726882598.16608: waiting for pending results...
30529 1726882598.16798: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml'
30529 1726882598.16865: in run() - task 12673a56-9f93-b0f1-edc0-000000000383
30529 1726882598.16878: variable 'ansible_search_path' from source: unknown
30529 1726882598.16881: variable 'ansible_search_path' from source: unknown
30529 1726882598.16914: calling self._execute()
30529 1726882598.16979: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.16983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.16995: variable 'omit' from source: magic vars
30529 1726882598.17258: variable 'ansible_distribution_major_version' from source: facts
30529 1726882598.17271: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882598.17274: _execute() done
30529 1726882598.17277: dumping result to json
30529 1726882598.17279: done dumping result, returning
30529 1726882598.17287: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-000000000383]
30529 1726882598.17296: sending task result for task 12673a56-9f93-b0f1-edc0-000000000383
30529 1726882598.17372: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000383
30529 1726882598.17376: WORKER PROCESS EXITING
30529 1726882598.17407: no more pending results, returning what we have
30529 1726882598.17412: in VariableManager get_vars()
30529 1726882598.17444: Calling all_inventory to load vars for managed_node1
30529 1726882598.17446: Calling groups_inventory to load vars for managed_node1
30529 1726882598.17449: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882598.17461: Calling all_plugins_play to load vars for managed_node1
30529 1726882598.17464: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882598.17466: Calling groups_plugins_play to load vars for managed_node1
30529 1726882598.18219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882598.19737: done with get_vars()
30529 1726882598.19752: variable 'ansible_search_path' from source: unknown
30529 1726882598.19754: variable 'ansible_search_path' from source: unknown
30529 1726882598.19761: variable 'item' from source: include params
30529 1726882598.19862: variable 'item' from source: include params
30529 1726882598.19894: we have included files to process
30529 1726882598.19896: generating all_blocks data
30529 1726882598.19898: done generating all_blocks data
30529 1726882598.19899: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30529 1726882598.19900: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30529 1726882598.19902: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30529 1726882598.21680: done processing included file
30529 1726882598.21682: iterating over new_blocks loaded from include file
30529 1726882598.21683: in VariableManager get_vars()
30529 1726882598.21701: done with get_vars()
30529 1726882598.21703: filtering new block on tags
30529 1726882598.22013: done filtering new block on tags
30529 1726882598.22017: in VariableManager get_vars()
30529 1726882598.22031: done with get_vars()
30529 1726882598.22033: filtering new block on tags
30529 1726882598.22090: done filtering new block on tags
30529 1726882598.22280: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1
30529 1726882598.22286: extending task lists for all hosts with included blocks
30529 1726882598.22656: done extending task lists
30529 1726882598.22657: done processing included files
30529 1726882598.22658: results queue empty
30529 1726882598.22658: checking for any_errors_fatal
30529 1726882598.22662: done checking for any_errors_fatal
30529 1726882598.22662: checking for max_fail_percentage
30529 1726882598.22663: done checking for max_fail_percentage
30529 1726882598.22664: checking to see if all hosts have failed and the running result is not ok
30529 1726882598.22665: done checking to see if all hosts have failed
30529 1726882598.22666: getting the remaining hosts for this loop
30529 1726882598.22668: done getting the remaining hosts for this loop
30529 1726882598.22670: getting the next task for host managed_node1
30529 1726882598.22675: done getting next task for host managed_node1
30529 1726882598.22676: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
30529 1726882598.22679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882598.22682: getting variables
30529 1726882598.22683: in VariableManager get_vars()
30529 1726882598.22691: Calling all_inventory to load vars for managed_node1
30529 1726882598.22997: Calling groups_inventory to load vars for managed_node1
30529 1726882598.23000: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882598.23005: Calling all_plugins_play to load vars for managed_node1
30529 1726882598.23008: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882598.23011: Calling groups_plugins_play to load vars for managed_node1
30529 1726882598.30057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882598.32400: done with get_vars()
30529 1726882598.32509: done getting variables
30529 1726882598.32557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024  21:36:38 -0400 (0:00:00.162)       0:00:12.351 ******
30529 1726882598.32588: entering _queue_task() for managed_node1/set_fact
30529 1726882598.33499: worker is 1 (out of 1 available)
30529 1726882598.33510: exiting _queue_task() for managed_node1/set_fact
30529 1726882598.33521: done queuing things up, now waiting for results queue to drain
30529 1726882598.33522: waiting for pending results...
30529 1726882598.34158: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag
30529 1726882598.34283: in run() - task 12673a56-9f93-b0f1-edc0-0000000003fe
30529 1726882598.34306: variable 'ansible_search_path' from source: unknown
30529 1726882598.34310: variable 'ansible_search_path' from source: unknown
30529 1726882598.34347: calling self._execute()
30529 1726882598.34558: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.34564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.34580: variable 'omit' from source: magic vars
30529 1726882598.35448: variable 'ansible_distribution_major_version' from source: facts
30529 1726882598.35463: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882598.35470: variable 'omit' from source: magic vars
30529 1726882598.35530: variable 'omit' from source: magic vars
30529 1726882598.35678: variable 'omit' from source: magic vars
30529 1726882598.35720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882598.35755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882598.35897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882598.35915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882598.35929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882598.35960: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882598.35964: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.35967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.36195: Set connection var ansible_shell_executable to /bin/sh
30529 1726882598.36323: Set connection var ansible_pipelining to False
30529 1726882598.36327: Set connection var ansible_shell_type to sh
30529 1726882598.36338: Set connection var ansible_timeout to 10
30529 1726882598.36342: Set connection var ansible_connection to ssh
30529 1726882598.36347: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882598.36426: variable 'ansible_shell_executable' from source: unknown
30529 1726882598.36430: variable 'ansible_connection' from source: unknown
30529 1726882598.36433: variable 'ansible_module_compression' from source: unknown
30529 1726882598.36435: variable 'ansible_shell_type' from source: unknown
30529 1726882598.36437: variable 'ansible_shell_executable' from source: unknown
30529 1726882598.36438: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.36440: variable 'ansible_pipelining' from source: unknown
30529 1726882598.36465: variable 'ansible_timeout' from source: unknown
30529 1726882598.36468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.36755: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882598.36759: variable 'omit' from source: magic vars
30529 1726882598.36762: starting attempt loop
30529 1726882598.36764: running the handler
30529 1726882598.36766: handler run complete
30529 1726882598.37082: attempt loop complete, returning result
30529 1726882598.37085: _execute() done
30529 1726882598.37087: dumping result to json
30529 1726882598.37089: done dumping result, returning
30529 1726882598.37091: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-0000000003fe]
30529 1726882598.37184: sending task result for task 12673a56-9f93-b0f1-edc0-0000000003fe
30529 1726882598.37271: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000003fe
30529 1726882598.37274: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
30529 1726882598.37350: no more pending results, returning what we have
30529 1726882598.37354: results queue empty
30529 1726882598.37354: checking for any_errors_fatal
30529 1726882598.37356: done checking for any_errors_fatal
30529 1726882598.37357: checking for max_fail_percentage
30529 1726882598.37358: done checking for max_fail_percentage
30529 1726882598.37359: checking to see if all hosts have failed and the running result is not ok
30529 1726882598.37360: done checking to see if all hosts have failed
30529 1726882598.37361: getting the remaining hosts for this loop
30529 1726882598.37363: done getting the remaining hosts for this loop
30529 1726882598.37368: getting the next task for host managed_node1
30529 1726882598.37377: done getting next task for host managed_node1
30529 1726882598.37380: ^ task is: TASK: Stat profile file
30529 1726882598.37388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882598.37398: getting variables
30529 1726882598.37401: in VariableManager get_vars()
30529 1726882598.37435: Calling all_inventory to load vars for managed_node1
30529 1726882598.37438: Calling groups_inventory to load vars for managed_node1
30529 1726882598.37442: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882598.37454: Calling all_plugins_play to load vars for managed_node1
30529 1726882598.37457: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882598.37460: Calling groups_plugins_play to load vars for managed_node1
30529 1726882598.40451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882598.43973: done with get_vars()
30529 1726882598.44114: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024  21:36:38 -0400 (0:00:00.117)       0:00:12.469 ******
30529 1726882598.44336: entering _queue_task() for managed_node1/stat
30529 1726882598.45107: worker is 1 (out of 1 available)
30529 1726882598.45120: exiting _queue_task() for managed_node1/stat
30529 1726882598.45133: done queuing things up, now waiting for results queue to drain
30529 1726882598.45134: waiting for pending results...
30529 1726882598.45714: running TaskExecutor() for managed_node1/TASK: Stat profile file
30529 1726882598.45833: in run() - task 12673a56-9f93-b0f1-edc0-0000000003ff
30529 1726882598.45859: variable 'ansible_search_path' from source: unknown
30529 1726882598.45868: variable 'ansible_search_path' from source: unknown
30529 1726882598.46037: calling self._execute()
30529 1726882598.46225: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.46229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.46243: variable 'omit' from source: magic vars
30529 1726882598.47120: variable 'ansible_distribution_major_version' from source: facts
30529 1726882598.47124: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882598.47126: variable 'omit' from source: magic vars
30529 1726882598.47129: variable 'omit' from source: magic vars
30529 1726882598.47460: variable 'profile' from source: play vars
30529 1726882598.47463: variable 'interface' from source: play vars
30529 1726882598.47467: variable 'interface' from source: play vars
30529 1726882598.47723: variable 'omit' from source: magic vars
30529 1726882598.47761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882598.47800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882598.47827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882598.47846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882598.47858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882598.47888: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882598.47990: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.47997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.48156: Set connection var ansible_shell_executable to /bin/sh
30529 1726882598.48161: Set connection var ansible_pipelining to False
30529 1726882598.48164: Set connection var ansible_shell_type to sh
30529 1726882598.48174: Set connection var ansible_timeout to 10
30529 1726882598.48177: Set connection var ansible_connection to ssh
30529 1726882598.48183: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882598.48212: variable 'ansible_shell_executable' from source: unknown
30529 1726882598.48216: variable 'ansible_connection' from source: unknown
30529 1726882598.48219: variable 'ansible_module_compression' from source: unknown
30529 1726882598.48221: variable 'ansible_shell_type' from source: unknown
30529 1726882598.48223: variable 'ansible_shell_executable' from source: unknown
30529 1726882598.48226: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882598.48229: variable 'ansible_pipelining' from source: unknown
30529 1726882598.48231: variable 'ansible_timeout' from source: unknown
30529 1726882598.48234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882598.48699: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
30529 1726882598.48703: variable 'omit' from source: magic vars
30529 1726882598.48902: starting attempt loop
30529 1726882598.48906: running the handler
30529 1726882598.48908: _low_level_execute_command(): starting
30529 1726882598.48911: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30529 1726882598.50316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882598.50469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882598.50516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882598.50656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882598.52411: stdout chunk (state=3): >>>/root <<<
30529 1726882598.52459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882598.52496: stdout chunk (state=3): >>><<<
30529 1726882598.52500: stderr chunk (state=3): >>><<<
30529 1726882598.52708: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882598.52713: _low_level_execute_command(): starting
30529 1726882598.52716: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709 `" && echo ansible-tmp-1726882598.5262213-31077-206792068316709="` echo /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709 `" ) && sleep 0'
30529 1726882598.53785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882598.53790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30529 1726882598.53795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882598.53798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30529 1726882598.53927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
30529 1726882598.53938: stderr chunk (state=3): >>>debug2: match not found <<<
30529 1726882598.53941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882598.53943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30529 1726882598.53946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882598.54067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882598.54085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882598.54159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882598.56020: stdout chunk (state=3): >>>ansible-tmp-1726882598.5262213-31077-206792068316709=/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709 <<<
30529 1726882598.56229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882598.56233: stdout chunk (state=3): >>><<<
30529 1726882598.56235: stderr chunk (state=3): >>><<<
30529 1726882598.56253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882598.5262213-31077-206792068316709=/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882598.56309: variable 'ansible_module_compression' from source: unknown
30529 1726882598.56485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
30529 1726882598.56574: variable 'ansible_facts' from source: unknown
30529 1726882598.56743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py
30529 1726882598.57028: Sending initial data
30529 1726882598.57031: Sent initial data (153 bytes)
30529 1726882598.57761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30529 1726882598.57969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882598.57973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882598.57976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882598.57979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882598.57981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
30529 1726882598.57983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882598.58107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882598.58197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882598.59645: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<<
30529 1726882598.59659: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
30529 1726882598.59738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
30529 1726882598.59841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpzz2c06sy /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py <<<
30529 1726882598.59844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py" <<<
30529 1726882598.59882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpzz2c06sy" to remote "/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py" <<<
30529 1726882598.61002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882598.61067: stdout chunk (state=3): >>><<<
30529 1726882598.61070: stderr chunk (state=3): >>><<<
30529 1726882598.61073: done transferring module to remote
30529 1726882598.61075: _low_level_execute_command(): starting
30529 1726882598.61077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/ /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py && sleep 0'
30529 1726882598.61807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.61861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882598.61880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882598.61897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882598.61974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.63703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882598.63706: stdout chunk (state=3): >>><<< 30529 1726882598.63709: stderr chunk (state=3): >>><<< 30529 1726882598.63721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882598.63725: _low_level_execute_command(): starting 30529 1726882598.63729: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/AnsiballZ_stat.py && sleep 0' 30529 1726882598.64144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882598.64147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882598.64149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.64152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882598.64154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.64192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882598.64215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30529 1726882598.64257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.79268: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882598.80490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882598.80523: stderr chunk (state=3): >>><<< 30529 1726882598.80526: stdout chunk (state=3): >>><<< 30529 1726882598.80550: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882598.80591: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882598.80600: _low_level_execute_command(): starting 30529 1726882598.80605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882598.5262213-31077-206792068316709/ > /dev/null 2>&1 && sleep 0' 30529 1726882598.81113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882598.81117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.81131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.81191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882598.81200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882598.81202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882598.81244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.83037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882598.83059: stderr chunk (state=3): >>><<< 30529 1726882598.83062: stdout chunk (state=3): >>><<< 30529 1726882598.83076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882598.83081: handler run complete 30529 1726882598.83101: attempt loop complete, returning result 30529 1726882598.83104: _execute() done 30529 1726882598.83106: dumping result to json 30529 1726882598.83108: done dumping result, returning 30529 1726882598.83118: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-0000000003ff] 30529 1726882598.83120: sending task result for task 12673a56-9f93-b0f1-edc0-0000000003ff 30529 1726882598.83217: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000003ff 30529 1726882598.83220: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882598.83275: no more pending results, returning what we have 30529 1726882598.83277: results queue empty 30529 1726882598.83278: checking for any_errors_fatal 30529 1726882598.83284: done checking for any_errors_fatal 30529 1726882598.83284: checking for max_fail_percentage 30529 1726882598.83286: done checking for max_fail_percentage 30529 1726882598.83287: checking to see if all hosts have failed and the running result is not ok 30529 1726882598.83288: done checking to see if all hosts have failed 30529 1726882598.83288: getting the remaining hosts for this loop 30529 1726882598.83290: done getting the remaining hosts for this loop 30529 1726882598.83301: getting the next task for host managed_node1 30529 1726882598.83309: done getting next task for host managed_node1 30529 1726882598.83311: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882598.83316: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882598.83319: getting variables 30529 1726882598.83321: in VariableManager get_vars() 30529 1726882598.83352: Calling all_inventory to load vars for managed_node1 30529 1726882598.83354: Calling groups_inventory to load vars for managed_node1 30529 1726882598.83358: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882598.83368: Calling all_plugins_play to load vars for managed_node1 30529 1726882598.83371: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882598.83373: Calling groups_plugins_play to load vars for managed_node1 30529 1726882598.84462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882598.85381: done with get_vars() 30529 1726882598.85401: done getting variables 30529 1726882598.85443: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:38 -0400 (0:00:00.411) 0:00:12.880 ****** 30529 1726882598.85466: entering _queue_task() for managed_node1/set_fact 30529 1726882598.85697: worker is 1 (out of 1 available) 30529 1726882598.85708: exiting _queue_task() for managed_node1/set_fact 30529 1726882598.85722: done queuing things up, now waiting for results queue to drain 30529 1726882598.85724: waiting for pending results... 
30529 1726882598.85898: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882598.85983: in run() - task 12673a56-9f93-b0f1-edc0-000000000400 30529 1726882598.86000: variable 'ansible_search_path' from source: unknown 30529 1726882598.86006: variable 'ansible_search_path' from source: unknown 30529 1726882598.86040: calling self._execute() 30529 1726882598.86107: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.86116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.86131: variable 'omit' from source: magic vars 30529 1726882598.86471: variable 'ansible_distribution_major_version' from source: facts 30529 1726882598.86474: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882598.86562: variable 'profile_stat' from source: set_fact 30529 1726882598.86579: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882598.86582: when evaluation is False, skipping this task 30529 1726882598.86585: _execute() done 30529 1726882598.86588: dumping result to json 30529 1726882598.86605: done dumping result, returning 30529 1726882598.86608: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-000000000400] 30529 1726882598.86612: sending task result for task 12673a56-9f93-b0f1-edc0-000000000400 30529 1726882598.86711: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000400 30529 1726882598.86715: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882598.86796: no more pending results, returning what we have 30529 1726882598.86800: results queue empty 30529 1726882598.86801: checking for any_errors_fatal 30529 1726882598.86806: done checking for any_errors_fatal 30529 1726882598.86807: 
checking for max_fail_percentage 30529 1726882598.86808: done checking for max_fail_percentage 30529 1726882598.86809: checking to see if all hosts have failed and the running result is not ok 30529 1726882598.86810: done checking to see if all hosts have failed 30529 1726882598.86810: getting the remaining hosts for this loop 30529 1726882598.86811: done getting the remaining hosts for this loop 30529 1726882598.86815: getting the next task for host managed_node1 30529 1726882598.86821: done getting next task for host managed_node1 30529 1726882598.86822: ^ task is: TASK: Get NM profile info 30529 1726882598.86827: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882598.86830: getting variables 30529 1726882598.86831: in VariableManager get_vars() 30529 1726882598.86857: Calling all_inventory to load vars for managed_node1 30529 1726882598.86860: Calling groups_inventory to load vars for managed_node1 30529 1726882598.86862: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882598.86871: Calling all_plugins_play to load vars for managed_node1 30529 1726882598.86873: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882598.86876: Calling groups_plugins_play to load vars for managed_node1 30529 1726882598.87646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882598.88847: done with get_vars() 30529 1726882598.88860: done getting variables 30529 1726882598.88951: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:38 -0400 (0:00:00.035) 0:00:12.915 ****** 30529 1726882598.88980: entering _queue_task() for managed_node1/shell 30529 1726882598.88983: Creating lock for shell 30529 1726882598.89220: worker is 1 (out of 1 available) 30529 1726882598.89232: exiting _queue_task() for managed_node1/shell 30529 1726882598.89243: done queuing things up, now waiting for results queue to drain 30529 1726882598.89245: waiting for pending results... 
30529 1726882598.89443: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882598.89563: in run() - task 12673a56-9f93-b0f1-edc0-000000000401 30529 1726882598.89576: variable 'ansible_search_path' from source: unknown 30529 1726882598.89579: variable 'ansible_search_path' from source: unknown 30529 1726882598.89608: calling self._execute() 30529 1726882598.89710: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.89713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.89733: variable 'omit' from source: magic vars 30529 1726882598.89989: variable 'ansible_distribution_major_version' from source: facts 30529 1726882598.90003: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882598.90008: variable 'omit' from source: magic vars 30529 1726882598.90041: variable 'omit' from source: magic vars 30529 1726882598.90113: variable 'profile' from source: play vars 30529 1726882598.90117: variable 'interface' from source: play vars 30529 1726882598.90163: variable 'interface' from source: play vars 30529 1726882598.90184: variable 'omit' from source: magic vars 30529 1726882598.90224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882598.90247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882598.90262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882598.90276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882598.90288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882598.90314: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 
1726882598.90317: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.90320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.90383: Set connection var ansible_shell_executable to /bin/sh 30529 1726882598.90402: Set connection var ansible_pipelining to False 30529 1726882598.90407: Set connection var ansible_shell_type to sh 30529 1726882598.90410: Set connection var ansible_timeout to 10 30529 1726882598.90413: Set connection var ansible_connection to ssh 30529 1726882598.90416: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882598.90442: variable 'ansible_shell_executable' from source: unknown 30529 1726882598.90446: variable 'ansible_connection' from source: unknown 30529 1726882598.90448: variable 'ansible_module_compression' from source: unknown 30529 1726882598.90451: variable 'ansible_shell_type' from source: unknown 30529 1726882598.90453: variable 'ansible_shell_executable' from source: unknown 30529 1726882598.90455: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882598.90457: variable 'ansible_pipelining' from source: unknown 30529 1726882598.90460: variable 'ansible_timeout' from source: unknown 30529 1726882598.90462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882598.90557: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882598.90565: variable 'omit' from source: magic vars 30529 1726882598.90570: starting attempt loop 30529 1726882598.90573: running the handler 30529 1726882598.90581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882598.90600: _low_level_execute_command(): starting 30529 1726882598.90606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882598.91280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882598.91285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882598.91288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.91361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882598.91365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882598.91404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.92967: stdout chunk (state=3): >>>/root <<< 30529 1726882598.93064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882598.93091: stderr chunk (state=3): 
>>><<< 30529 1726882598.93097: stdout chunk (state=3): >>><<< 30529 1726882598.93121: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882598.93127: _low_level_execute_command(): starting 30529 1726882598.93133: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257 `" && echo ansible-tmp-1726882598.9311507-31098-4049400685257="` echo /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257 `" ) && sleep 0' 30529 1726882598.94164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882598.94311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882598.94334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882598.94410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.96253: stdout chunk (state=3): >>>ansible-tmp-1726882598.9311507-31098-4049400685257=/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257 <<< 30529 1726882598.96361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882598.96381: stderr chunk (state=3): >>><<< 30529 1726882598.96384: stdout chunk (state=3): >>><<< 30529 1726882598.96400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882598.9311507-31098-4049400685257=/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882598.96428: variable 'ansible_module_compression' from source: unknown 30529 1726882598.96468: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882598.96499: variable 'ansible_facts' from source: unknown 30529 1726882598.96555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py 30529 1726882598.96649: Sending initial data 30529 1726882598.96653: Sent initial data (154 bytes) 30529 1726882598.97173: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882598.97187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882598.97213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882598.97284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882598.98772: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882598.98804: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882598.98855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882598.98922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpnc2nffoz /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py <<< 30529 1726882598.98927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py" <<< 30529 1726882598.98976: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpnc2nffoz" to remote "/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py" <<< 30529 1726882598.99724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882598.99900: stderr chunk (state=3): >>><<< 30529 1726882598.99903: stdout chunk (state=3): >>><<< 30529 1726882598.99906: done transferring module to remote 30529 1726882598.99908: _low_level_execute_command(): starting 30529 1726882598.99911: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/ /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py && sleep 0' 30529 1726882599.00437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882599.00452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882599.00510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882599.00571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882599.00595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882599.00737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882599.00772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882599.02571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882599.02575: stdout chunk (state=3): >>><<< 30529 1726882599.02699: stderr chunk (state=3): >>><<< 30529 1726882599.02706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882599.02709: _low_level_execute_command(): starting 30529 1726882599.02711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/AnsiballZ_command.py && sleep 0' 30529 1726882599.03195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882599.03211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882599.03222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882599.03235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882599.03253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882599.03356: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882599.03377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882599.03405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882599.03503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882599.03526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882599.03540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882599.03632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882599.20408: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:36:39.184865", "end": "2024-09-20 21:36:39.201935", "delta": "0:00:00.017070", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882599.21783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882599.21899: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882599.21903: stdout chunk (state=3): >>><<< 30529 1726882599.21905: stderr chunk (state=3): >>><<< 30529 1726882599.21924: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:36:39.184865", "end": "2024-09-20 21:36:39.201935", "delta": "0:00:00.017070", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 30529 1726882599.22143: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882599.22147: _low_level_execute_command(): starting 30529 1726882599.22149: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882598.9311507-31098-4049400685257/ > /dev/null 2>&1 && sleep 0' 30529 1726882599.23205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882599.23410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882599.23423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882599.23434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882599.23502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882599.23506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882599.23607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882599.23682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882599.25744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882599.25748: stdout chunk (state=3): >>><<< 30529 1726882599.25755: stderr chunk (state=3): >>><<< 30529 1726882599.25775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882599.25778: handler run complete 30529 1726882599.25804: Evaluated conditional (False): False 30529 1726882599.25815: attempt loop complete, returning result 30529 1726882599.25818: _execute() done 30529 1726882599.25820: dumping result to json 30529 1726882599.25825: done dumping result, returning 30529 1726882599.25834: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-000000000401] 30529 1726882599.25836: sending task result for task 12673a56-9f93-b0f1-edc0-000000000401 ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017070", "end": "2024-09-20 21:36:39.201935", "rc": 0, "start": "2024-09-20 21:36:39.184865" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30529 1726882599.26192: no more pending results, returning what we have 30529 1726882599.26198: results queue empty 30529 1726882599.26199: checking for any_errors_fatal 30529 1726882599.26206: done checking for any_errors_fatal 30529 1726882599.26207: checking for max_fail_percentage 30529 1726882599.26208: done checking for max_fail_percentage 30529 1726882599.26209: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.26210: done checking to see if all hosts have failed 30529 1726882599.26211: getting the remaining hosts for this loop 30529 1726882599.26212: done getting the remaining hosts for this loop 30529 1726882599.26216: getting the next task for host managed_node1 30529 1726882599.26223: done getting next task for host managed_node1 30529 1726882599.26226: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882599.26231: ^ state is: HOST STATE: block=3, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.26235: getting variables 30529 1726882599.26237: in VariableManager get_vars() 30529 1726882599.26266: Calling all_inventory to load vars for managed_node1 30529 1726882599.26268: Calling groups_inventory to load vars for managed_node1 30529 1726882599.26272: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.26282: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.26285: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.26290: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.27360: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000401 30529 1726882599.27364: WORKER PROCESS EXITING 30529 1726882599.29425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.31338: done with get_vars() 30529 1726882599.31474: done getting variables 30529 1726882599.31733: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:39 -0400 (0:00:00.427) 0:00:13.343 ****** 30529 1726882599.31766: entering _queue_task() for managed_node1/set_fact 30529 1726882599.32380: worker is 1 (out of 1 available) 30529 1726882599.32397: exiting _queue_task() for managed_node1/set_fact 30529 1726882599.32412: done queuing things up, now waiting for results queue to drain 30529 1726882599.32414: waiting for pending results... 
30529 1726882599.32758: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882599.32874: in run() - task 12673a56-9f93-b0f1-edc0-000000000402 30529 1726882599.32901: variable 'ansible_search_path' from source: unknown 30529 1726882599.32905: variable 'ansible_search_path' from source: unknown 30529 1726882599.33130: calling self._execute() 30529 1726882599.33242: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.33253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.33267: variable 'omit' from source: magic vars 30529 1726882599.33675: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.33764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.33845: variable 'nm_profile_exists' from source: set_fact 30529 1726882599.33869: Evaluated conditional (nm_profile_exists.rc == 0): True 30529 1726882599.33890: variable 'omit' from source: magic vars 30529 1726882599.33950: variable 'omit' from source: magic vars 30529 1726882599.34004: variable 'omit' from source: magic vars 30529 1726882599.34100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882599.34105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882599.34127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882599.34151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.34168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.34214: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
30529 1726882599.34223: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.34229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.34346: Set connection var ansible_shell_executable to /bin/sh 30529 1726882599.34421: Set connection var ansible_pipelining to False 30529 1726882599.34424: Set connection var ansible_shell_type to sh 30529 1726882599.34427: Set connection var ansible_timeout to 10 30529 1726882599.34429: Set connection var ansible_connection to ssh 30529 1726882599.34431: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882599.34433: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.34435: variable 'ansible_connection' from source: unknown 30529 1726882599.34437: variable 'ansible_module_compression' from source: unknown 30529 1726882599.34439: variable 'ansible_shell_type' from source: unknown 30529 1726882599.34441: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.34498: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.34502: variable 'ansible_pipelining' from source: unknown 30529 1726882599.34504: variable 'ansible_timeout' from source: unknown 30529 1726882599.34507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.34627: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882599.34651: variable 'omit' from source: magic vars 30529 1726882599.34665: starting attempt loop 30529 1726882599.34674: running the handler 30529 1726882599.34695: handler run complete 30529 1726882599.34746: attempt loop complete, returning result 30529 1726882599.34749: _execute() done 
30529 1726882599.34751: dumping result to json 30529 1726882599.34753: done dumping result, returning 30529 1726882599.34756: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-000000000402] 30529 1726882599.34758: sending task result for task 12673a56-9f93-b0f1-edc0-000000000402 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30529 1726882599.35025: no more pending results, returning what we have 30529 1726882599.35028: results queue empty 30529 1726882599.35030: checking for any_errors_fatal 30529 1726882599.35037: done checking for any_errors_fatal 30529 1726882599.35037: checking for max_fail_percentage 30529 1726882599.35039: done checking for max_fail_percentage 30529 1726882599.35040: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.35041: done checking to see if all hosts have failed 30529 1726882599.35042: getting the remaining hosts for this loop 30529 1726882599.35044: done getting the remaining hosts for this loop 30529 1726882599.35048: getting the next task for host managed_node1 30529 1726882599.35060: done getting next task for host managed_node1 30529 1726882599.35063: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882599.35069: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882599.35073: getting variables 30529 1726882599.35075: in VariableManager get_vars() 30529 1726882599.35110: Calling all_inventory to load vars for managed_node1 30529 1726882599.35113: Calling groups_inventory to load vars for managed_node1 30529 1726882599.35116: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.35127: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.35130: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.35133: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.35942: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000402 30529 1726882599.35946: WORKER PROCESS EXITING 30529 1726882599.38207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.40909: done with get_vars() 30529 1726882599.40936: done getting variables 30529 1726882599.41014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.41145: variable 'profile' from source: play vars 30529 1726882599.41149: variable 'interface' from source: play vars 30529 1726882599.41223: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:39 -0400 (0:00:00.094) 0:00:13.438 ****** 30529 1726882599.41257: entering _queue_task() for managed_node1/command 30529 1726882599.41744: worker is 1 (out of 1 available) 30529 1726882599.41756: exiting _queue_task() for managed_node1/command 30529 1726882599.41768: done queuing things up, now waiting for results queue to drain 30529 1726882599.41769: waiting for pending results... 
30529 1726882599.42371: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882599.42617: in run() - task 12673a56-9f93-b0f1-edc0-000000000404 30529 1726882599.42669: variable 'ansible_search_path' from source: unknown 30529 1726882599.42740: variable 'ansible_search_path' from source: unknown 30529 1726882599.42790: calling self._execute() 30529 1726882599.42884: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.42900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.42915: variable 'omit' from source: magic vars 30529 1726882599.43316: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.43335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.43575: variable 'profile_stat' from source: set_fact 30529 1726882599.43642: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882599.43650: when evaluation is False, skipping this task 30529 1726882599.43656: _execute() done 30529 1726882599.43662: dumping result to json 30529 1726882599.43670: done dumping result, returning 30529 1726882599.43953: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000404] 30529 1726882599.43956: sending task result for task 12673a56-9f93-b0f1-edc0-000000000404 30529 1726882599.44029: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000404 30529 1726882599.44032: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882599.44111: no more pending results, returning what we have 30529 1726882599.44115: results queue empty 30529 1726882599.44116: checking for any_errors_fatal 30529 1726882599.44123: done checking for any_errors_fatal 30529 1726882599.44124: 
checking for max_fail_percentage 30529 1726882599.44125: done checking for max_fail_percentage 30529 1726882599.44126: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.44127: done checking to see if all hosts have failed 30529 1726882599.44128: getting the remaining hosts for this loop 30529 1726882599.44130: done getting the remaining hosts for this loop 30529 1726882599.44134: getting the next task for host managed_node1 30529 1726882599.44142: done getting next task for host managed_node1 30529 1726882599.44144: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882599.44149: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.44153: getting variables 30529 1726882599.44156: in VariableManager get_vars() 30529 1726882599.44222: Calling all_inventory to load vars for managed_node1 30529 1726882599.44225: Calling groups_inventory to load vars for managed_node1 30529 1726882599.44229: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.44244: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.44248: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.44251: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.46943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.49526: done with get_vars() 30529 1726882599.49550: done getting variables 30529 1726882599.49619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.50081: variable 'profile' from source: play vars 30529 1726882599.50087: variable 'interface' from source: play vars 30529 1726882599.50151: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:39 -0400 (0:00:00.089) 0:00:13.527 ****** 30529 1726882599.50184: entering _queue_task() for managed_node1/set_fact 30529 1726882599.50525: worker is 1 (out of 1 available) 30529 1726882599.50650: exiting _queue_task() for managed_node1/set_fact 30529 1726882599.50662: done queuing things up, now waiting for results queue to drain 30529 1726882599.50663: waiting for pending results... 
30529 1726882599.50846: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882599.50981: in run() - task 12673a56-9f93-b0f1-edc0-000000000405 30529 1726882599.51012: variable 'ansible_search_path' from source: unknown 30529 1726882599.51085: variable 'ansible_search_path' from source: unknown 30529 1726882599.51091: calling self._execute() 30529 1726882599.51154: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.51165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.51180: variable 'omit' from source: magic vars 30529 1726882599.51953: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.51957: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.52084: variable 'profile_stat' from source: set_fact 30529 1726882599.52128: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882599.52136: when evaluation is False, skipping this task 30529 1726882599.52176: _execute() done 30529 1726882599.52184: dumping result to json 30529 1726882599.52198: done dumping result, returning 30529 1726882599.52210: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000405] 30529 1726882599.52498: sending task result for task 12673a56-9f93-b0f1-edc0-000000000405 30529 1726882599.52568: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000405 30529 1726882599.52571: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882599.52647: no more pending results, returning what we have 30529 1726882599.52652: results queue empty 30529 1726882599.52653: checking for any_errors_fatal 30529 1726882599.52659: done checking for any_errors_fatal 30529 1726882599.52660: 
checking for max_fail_percentage 30529 1726882599.52662: done checking for max_fail_percentage 30529 1726882599.52663: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.52664: done checking to see if all hosts have failed 30529 1726882599.52664: getting the remaining hosts for this loop 30529 1726882599.52666: done getting the remaining hosts for this loop 30529 1726882599.52671: getting the next task for host managed_node1 30529 1726882599.52680: done getting next task for host managed_node1 30529 1726882599.52683: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882599.52692: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.52699: getting variables 30529 1726882599.52706: in VariableManager get_vars() 30529 1726882599.52739: Calling all_inventory to load vars for managed_node1 30529 1726882599.52742: Calling groups_inventory to load vars for managed_node1 30529 1726882599.52746: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.52760: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.52764: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.52767: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.55466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.57927: done with get_vars() 30529 1726882599.57957: done getting variables 30529 1726882599.58023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.58144: variable 'profile' from source: play vars 30529 1726882599.58147: variable 'interface' from source: play vars 30529 1726882599.58215: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:39 -0400 (0:00:00.080) 0:00:13.608 ****** 30529 1726882599.58246: entering _queue_task() for managed_node1/command 30529 1726882599.58713: worker is 1 (out of 1 available) 30529 1726882599.58725: exiting _queue_task() for managed_node1/command 30529 1726882599.58737: done queuing things up, now waiting for results queue to drain 30529 1726882599.58739: waiting for pending results... 
30529 1726882599.58943: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882599.59137: in run() - task 12673a56-9f93-b0f1-edc0-000000000406 30529 1726882599.59141: variable 'ansible_search_path' from source: unknown 30529 1726882599.59144: variable 'ansible_search_path' from source: unknown 30529 1726882599.59160: calling self._execute() 30529 1726882599.59258: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.59267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.59279: variable 'omit' from source: magic vars 30529 1726882599.60046: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.60050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.60201: variable 'profile_stat' from source: set_fact 30529 1726882599.60330: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882599.60333: when evaluation is False, skipping this task 30529 1726882599.60336: _execute() done 30529 1726882599.60342: dumping result to json 30529 1726882599.60345: done dumping result, returning 30529 1726882599.60354: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000406] 30529 1726882599.60359: sending task result for task 12673a56-9f93-b0f1-edc0-000000000406 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882599.60515: no more pending results, returning what we have 30529 1726882599.60520: results queue empty 30529 1726882599.60521: checking for any_errors_fatal 30529 1726882599.60536: done checking for any_errors_fatal 30529 1726882599.60537: checking for max_fail_percentage 30529 1726882599.60539: done checking for max_fail_percentage 30529 1726882599.60540: checking to see if all hosts have 
failed and the running result is not ok 30529 1726882599.60541: done checking to see if all hosts have failed 30529 1726882599.60542: getting the remaining hosts for this loop 30529 1726882599.60544: done getting the remaining hosts for this loop 30529 1726882599.60549: getting the next task for host managed_node1 30529 1726882599.60558: done getting next task for host managed_node1 30529 1726882599.60561: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882599.60567: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.60572: getting variables 30529 1726882599.60574: in VariableManager get_vars() 30529 1726882599.60612: Calling all_inventory to load vars for managed_node1 30529 1726882599.60614: Calling groups_inventory to load vars for managed_node1 30529 1726882599.60619: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.60635: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.60911: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.60917: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.61502: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000406 30529 1726882599.62347: WORKER PROCESS EXITING 30529 1726882599.63269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.66218: done with get_vars() 30529 1726882599.66244: done getting variables 30529 1726882599.66317: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.66439: variable 'profile' from source: play vars 30529 1726882599.66443: variable 'interface' from source: play vars 30529 1726882599.66512: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:39 -0400 (0:00:00.082) 0:00:13.691 ****** 30529 1726882599.66543: entering _queue_task() for managed_node1/set_fact 30529 1726882599.67261: worker is 1 (out of 1 available) 30529 1726882599.67272: exiting _queue_task() for managed_node1/set_fact 30529 
1726882599.67283: done queuing things up, now waiting for results queue to drain 30529 1726882599.67287: waiting for pending results... 30529 1726882599.67699: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882599.68799: in run() - task 12673a56-9f93-b0f1-edc0-000000000407 30529 1726882599.68803: variable 'ansible_search_path' from source: unknown 30529 1726882599.68806: variable 'ansible_search_path' from source: unknown 30529 1726882599.68809: calling self._execute() 30529 1726882599.68812: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.68815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.68818: variable 'omit' from source: magic vars 30529 1726882599.69617: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.69635: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.69915: variable 'profile_stat' from source: set_fact 30529 1726882599.69934: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882599.69943: when evaluation is False, skipping this task 30529 1726882599.69950: _execute() done 30529 1726882599.69956: dumping result to json 30529 1726882599.69964: done dumping result, returning 30529 1726882599.70402: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000407] 30529 1726882599.70405: sending task result for task 12673a56-9f93-b0f1-edc0-000000000407 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882599.70515: no more pending results, returning what we have 30529 1726882599.70518: results queue empty 30529 1726882599.70519: checking for any_errors_fatal 30529 1726882599.70523: done checking for any_errors_fatal 30529 1726882599.70523: checking for 
max_fail_percentage 30529 1726882599.70525: done checking for max_fail_percentage 30529 1726882599.70527: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.70528: done checking to see if all hosts have failed 30529 1726882599.70529: getting the remaining hosts for this loop 30529 1726882599.70530: done getting the remaining hosts for this loop 30529 1726882599.70533: getting the next task for host managed_node1 30529 1726882599.70541: done getting next task for host managed_node1 30529 1726882599.70544: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30529 1726882599.70547: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.70551: getting variables 30529 1726882599.70553: in VariableManager get_vars() 30529 1726882599.70579: Calling all_inventory to load vars for managed_node1 30529 1726882599.70582: Calling groups_inventory to load vars for managed_node1 30529 1726882599.70584: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.70598: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.70601: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.70605: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.71302: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000407 30529 1726882599.71306: WORKER PROCESS EXITING 30529 1726882599.73708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.75942: done with get_vars() 30529 1726882599.75965: done getting variables 30529 1726882599.76197: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.76383: variable 'profile' from source: play vars 30529 1726882599.76390: variable 'interface' from source: play vars 30529 1726882599.76528: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:36:39 -0400 (0:00:00.100) 0:00:13.791 ****** 30529 1726882599.76561: entering _queue_task() for managed_node1/assert 30529 1726882599.77270: worker is 1 (out of 1 available) 30529 1726882599.77283: exiting _queue_task() for managed_node1/assert 30529 
1726882599.77302: done queuing things up, now waiting for results queue to drain 30529 1726882599.77304: waiting for pending results... 30529 1726882599.77726: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' 30529 1726882599.77946: in run() - task 12673a56-9f93-b0f1-edc0-000000000384 30529 1726882599.77964: variable 'ansible_search_path' from source: unknown 30529 1726882599.78099: variable 'ansible_search_path' from source: unknown 30529 1726882599.78105: calling self._execute() 30529 1726882599.78353: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.78356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.78359: variable 'omit' from source: magic vars 30529 1726882599.79087: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.79213: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.79330: variable 'omit' from source: magic vars 30529 1726882599.79333: variable 'omit' from source: magic vars 30529 1726882599.80098: variable 'profile' from source: play vars 30529 1726882599.80101: variable 'interface' from source: play vars 30529 1726882599.80103: variable 'interface' from source: play vars 30529 1726882599.80105: variable 'omit' from source: magic vars 30529 1726882599.80106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882599.80109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882599.80111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882599.80698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.80701: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.80704: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882599.80706: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.80709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.80711: Set connection var ansible_shell_executable to /bin/sh 30529 1726882599.80714: Set connection var ansible_pipelining to False 30529 1726882599.80716: Set connection var ansible_shell_type to sh 30529 1726882599.80719: Set connection var ansible_timeout to 10 30529 1726882599.80721: Set connection var ansible_connection to ssh 30529 1726882599.80724: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882599.80932: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.80939: variable 'ansible_connection' from source: unknown 30529 1726882599.80948: variable 'ansible_module_compression' from source: unknown 30529 1726882599.80954: variable 'ansible_shell_type' from source: unknown 30529 1726882599.80960: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.80965: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.80972: variable 'ansible_pipelining' from source: unknown 30529 1726882599.80978: variable 'ansible_timeout' from source: unknown 30529 1726882599.80984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.81124: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882599.81312: variable 'omit' from source: magic vars 30529 1726882599.81324: starting 
attempt loop 30529 1726882599.81331: running the handler 30529 1726882599.81439: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882599.81698: Evaluated conditional (lsr_net_profile_exists): True 30529 1726882599.81701: handler run complete 30529 1726882599.81704: attempt loop complete, returning result 30529 1726882599.81707: _execute() done 30529 1726882599.81710: dumping result to json 30529 1726882599.81713: done dumping result, returning 30529 1726882599.81715: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' [12673a56-9f93-b0f1-edc0-000000000384] 30529 1726882599.81718: sending task result for task 12673a56-9f93-b0f1-edc0-000000000384 30529 1726882599.81787: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000384 30529 1726882599.81790: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882599.82036: no more pending results, returning what we have 30529 1726882599.82039: results queue empty 30529 1726882599.82040: checking for any_errors_fatal 30529 1726882599.82045: done checking for any_errors_fatal 30529 1726882599.82046: checking for max_fail_percentage 30529 1726882599.82048: done checking for max_fail_percentage 30529 1726882599.82049: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.82050: done checking to see if all hosts have failed 30529 1726882599.82050: getting the remaining hosts for this loop 30529 1726882599.82052: done getting the remaining hosts for this loop 30529 1726882599.82055: getting the next task for host managed_node1 30529 1726882599.82062: done getting next task for host managed_node1 30529 1726882599.82064: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30529 1726882599.82068: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882599.82072: getting variables 30529 1726882599.82073: in VariableManager get_vars() 30529 1726882599.82105: Calling all_inventory to load vars for managed_node1 30529 1726882599.82108: Calling groups_inventory to load vars for managed_node1 30529 1726882599.82111: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.82121: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.82123: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.82126: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.84635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.87920: done with get_vars() 30529 1726882599.87942: done getting variables 30529 1726882599.88203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.88325: variable 'profile' from source: play vars 30529 1726882599.88329: variable 'interface' from source: 
play vars 30529 1726882599.88391: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:36:39 -0400 (0:00:00.120) 0:00:13.912 ****** 30529 1726882599.88633: entering _queue_task() for managed_node1/assert 30529 1726882599.89622: worker is 1 (out of 1 available) 30529 1726882599.89632: exiting _queue_task() for managed_node1/assert 30529 1726882599.89642: done queuing things up, now waiting for results queue to drain 30529 1726882599.89644: waiting for pending results... 30529 1726882599.89860: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' 30529 1726882599.90075: in run() - task 12673a56-9f93-b0f1-edc0-000000000385 30529 1726882599.90095: variable 'ansible_search_path' from source: unknown 30529 1726882599.90099: variable 'ansible_search_path' from source: unknown 30529 1726882599.90257: calling self._execute() 30529 1726882599.90421: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.90427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.90680: variable 'omit' from source: magic vars 30529 1726882599.91127: variable 'ansible_distribution_major_version' from source: facts 30529 1726882599.91130: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882599.91132: variable 'omit' from source: magic vars 30529 1726882599.91599: variable 'omit' from source: magic vars 30529 1726882599.91602: variable 'profile' from source: play vars 30529 1726882599.91604: variable 'interface' from source: play vars 30529 1726882599.91606: variable 'interface' from source: play vars 30529 1726882599.91628: variable 'omit' from source: magic vars 30529 1726882599.91670: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882599.91721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882599.91740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882599.91757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.91770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882599.91803: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882599.91954: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.91957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.92083: Set connection var ansible_shell_executable to /bin/sh 30529 1726882599.92172: Set connection var ansible_pipelining to False 30529 1726882599.92175: Set connection var ansible_shell_type to sh 30529 1726882599.92177: Set connection var ansible_timeout to 10 30529 1726882599.92179: Set connection var ansible_connection to ssh 30529 1726882599.92181: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882599.92192: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.92201: variable 'ansible_connection' from source: unknown 30529 1726882599.92204: variable 'ansible_module_compression' from source: unknown 30529 1726882599.92206: variable 'ansible_shell_type' from source: unknown 30529 1726882599.92209: variable 'ansible_shell_executable' from source: unknown 30529 1726882599.92214: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.92216: variable 'ansible_pipelining' from source: unknown 30529 1726882599.92220: variable 'ansible_timeout' from 
source: unknown 30529 1726882599.92224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.92476: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882599.92486: variable 'omit' from source: magic vars 30529 1726882599.92498: starting attempt loop 30529 1726882599.92501: running the handler 30529 1726882599.92705: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30529 1726882599.92709: Evaluated conditional (lsr_net_profile_ansible_managed): True 30529 1726882599.92716: handler run complete 30529 1726882599.92731: attempt loop complete, returning result 30529 1726882599.92734: _execute() done 30529 1726882599.92736: dumping result to json 30529 1726882599.92739: done dumping result, returning 30529 1726882599.92746: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' [12673a56-9f93-b0f1-edc0-000000000385] 30529 1726882599.92755: sending task result for task 12673a56-9f93-b0f1-edc0-000000000385 30529 1726882599.92990: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000385 30529 1726882599.92997: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882599.93048: no more pending results, returning what we have 30529 1726882599.93051: results queue empty 30529 1726882599.93052: checking for any_errors_fatal 30529 1726882599.93059: done checking for any_errors_fatal 30529 1726882599.93060: checking for max_fail_percentage 30529 1726882599.93062: done checking for max_fail_percentage 30529 1726882599.93064: checking to see if all hosts have failed and the running result is not ok 30529 1726882599.93065: done checking to see 
if all hosts have failed 30529 1726882599.93066: getting the remaining hosts for this loop 30529 1726882599.93068: done getting the remaining hosts for this loop 30529 1726882599.93071: getting the next task for host managed_node1 30529 1726882599.93080: done getting next task for host managed_node1 30529 1726882599.93082: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30529 1726882599.93089: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882599.93096: getting variables 30529 1726882599.93098: in VariableManager get_vars() 30529 1726882599.93133: Calling all_inventory to load vars for managed_node1 30529 1726882599.93136: Calling groups_inventory to load vars for managed_node1 30529 1726882599.93140: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882599.93152: Calling all_plugins_play to load vars for managed_node1 30529 1726882599.93155: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882599.93158: Calling groups_plugins_play to load vars for managed_node1 30529 1726882599.95649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882599.98434: done with get_vars() 30529 1726882599.98465: done getting variables 30529 1726882599.98537: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882599.98655: variable 'profile' from source: play vars 30529 1726882599.98659: variable 'interface' from source: play vars 30529 1726882599.98728: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:36:39 -0400 (0:00:00.101) 0:00:14.014 ****** 30529 1726882599.98796: entering _queue_task() for managed_node1/assert 30529 1726882599.99144: worker is 1 (out of 1 available) 30529 1726882599.99158: exiting _queue_task() for managed_node1/assert 30529 1726882599.99171: done queuing things up, now waiting for results queue to drain 30529 1726882599.99173: waiting for pending results... 
30529 1726882599.99466: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr 30529 1726882599.99606: in run() - task 12673a56-9f93-b0f1-edc0-000000000386 30529 1726882599.99611: variable 'ansible_search_path' from source: unknown 30529 1726882599.99618: variable 'ansible_search_path' from source: unknown 30529 1726882599.99656: calling self._execute() 30529 1726882599.99799: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882599.99802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882599.99805: variable 'omit' from source: magic vars 30529 1726882600.00169: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.00186: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.00200: variable 'omit' from source: magic vars 30529 1726882600.00255: variable 'omit' from source: magic vars 30529 1726882600.00340: variable 'profile' from source: play vars 30529 1726882600.00364: variable 'interface' from source: play vars 30529 1726882600.00418: variable 'interface' from source: play vars 30529 1726882600.00473: variable 'omit' from source: magic vars 30529 1726882600.00488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882600.00527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882600.00548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882600.00566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.00590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.00691: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 30529 1726882600.00695: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.00698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.00736: Set connection var ansible_shell_executable to /bin/sh 30529 1726882600.00745: Set connection var ansible_pipelining to False 30529 1726882600.00751: Set connection var ansible_shell_type to sh 30529 1726882600.00762: Set connection var ansible_timeout to 10 30529 1726882600.00767: Set connection var ansible_connection to ssh 30529 1726882600.00775: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882600.00807: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.00815: variable 'ansible_connection' from source: unknown 30529 1726882600.00821: variable 'ansible_module_compression' from source: unknown 30529 1726882600.00826: variable 'ansible_shell_type' from source: unknown 30529 1726882600.00831: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.00836: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.00843: variable 'ansible_pipelining' from source: unknown 30529 1726882600.00849: variable 'ansible_timeout' from source: unknown 30529 1726882600.00855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.00984: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882600.01000: variable 'omit' from source: magic vars 30529 1726882600.01126: starting attempt loop 30529 1726882600.01129: running the handler 30529 1726882600.01140: variable 'lsr_net_profile_fingerprint' from source: set_fact 30529 1726882600.01146: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30529 1726882600.01155: handler run complete 30529 1726882600.01174: attempt loop complete, returning result 30529 1726882600.01181: _execute() done 30529 1726882600.01187: dumping result to json 30529 1726882600.01197: done dumping result, returning 30529 1726882600.01210: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr [12673a56-9f93-b0f1-edc0-000000000386] 30529 1726882600.01221: sending task result for task 12673a56-9f93-b0f1-edc0-000000000386 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882600.01364: no more pending results, returning what we have 30529 1726882600.01367: results queue empty 30529 1726882600.01368: checking for any_errors_fatal 30529 1726882600.01375: done checking for any_errors_fatal 30529 1726882600.01375: checking for max_fail_percentage 30529 1726882600.01377: done checking for max_fail_percentage 30529 1726882600.01378: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.01379: done checking to see if all hosts have failed 30529 1726882600.01380: getting the remaining hosts for this loop 30529 1726882600.01382: done getting the remaining hosts for this loop 30529 1726882600.01388: getting the next task for host managed_node1 30529 1726882600.01398: done getting next task for host managed_node1 30529 1726882600.01402: ^ task is: TASK: Conditional asserts 30529 1726882600.01404: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882600.01410: getting variables 30529 1726882600.01411: in VariableManager get_vars() 30529 1726882600.01441: Calling all_inventory to load vars for managed_node1 30529 1726882600.01443: Calling groups_inventory to load vars for managed_node1 30529 1726882600.01446: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.01459: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.01462: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.01465: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.02006: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000386 30529 1726882600.02010: WORKER PROCESS EXITING 30529 1726882600.03330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.05106: done with get_vars() 30529 1726882600.05130: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:36:40 -0400 (0:00:00.064) 0:00:14.078 ****** 30529 1726882600.05236: entering _queue_task() for managed_node1/include_tasks 30529 1726882600.05628: worker is 1 (out of 1 available) 30529 1726882600.05641: exiting _queue_task() for managed_node1/include_tasks 30529 1726882600.05654: done queuing things up, now waiting for results queue to drain 30529 1726882600.05656: waiting for pending results... 
30529 1726882600.05888: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882600.05985: in run() - task 12673a56-9f93-b0f1-edc0-000000000097 30529 1726882600.06014: variable 'ansible_search_path' from source: unknown 30529 1726882600.06022: variable 'ansible_search_path' from source: unknown 30529 1726882600.06305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882600.08507: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882600.08791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882600.08797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882600.08861: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882600.08887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882600.09097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882600.09400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882600.09403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882600.09405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882600.09407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882600.09549: variable 'lsr_assert_when' from source: include params 30529 1726882600.09766: variable 'network_provider' from source: set_fact 30529 1726882600.09944: variable 'omit' from source: magic vars 30529 1726882600.10162: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.10171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.10180: variable 'omit' from source: magic vars 30529 1726882600.10633: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.10642: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.10843: variable 'item' from source: unknown 30529 1726882600.10849: Evaluated conditional (item['condition']): True 30529 1726882600.11074: variable 'item' from source: unknown 30529 1726882600.11109: variable 'item' from source: unknown 30529 1726882600.11227: variable 'item' from source: unknown 30529 1726882600.11482: dumping result to json 30529 1726882600.11485: done dumping result, returning 30529 1726882600.11487: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-000000000097] 30529 1726882600.11489: sending task result for task 12673a56-9f93-b0f1-edc0-000000000097 30529 1726882600.11527: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000097 30529 1726882600.11529: WORKER PROCESS EXITING 30529 1726882600.11608: no more pending results, returning what we have 30529 1726882600.11613: in VariableManager get_vars() 30529 1726882600.11646: Calling all_inventory to load vars for managed_node1 30529 1726882600.11648: Calling groups_inventory to load vars for managed_node1 30529 1726882600.11651: 
Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.11662: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.11664: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.11667: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.13171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.15736: done with get_vars() 30529 1726882600.15757: variable 'ansible_search_path' from source: unknown 30529 1726882600.15759: variable 'ansible_search_path' from source: unknown 30529 1726882600.15906: we have included files to process 30529 1726882600.15908: generating all_blocks data 30529 1726882600.15910: done generating all_blocks data 30529 1726882600.15916: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882600.15917: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882600.15920: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882600.16080: in VariableManager get_vars() 30529 1726882600.16368: done with get_vars() 30529 1726882600.16481: done processing included file 30529 1726882600.16483: iterating over new_blocks loaded from include file 30529 1726882600.16484: in VariableManager get_vars() 30529 1726882600.16595: done with get_vars() 30529 1726882600.16598: filtering new block on tags 30529 1726882600.16634: done filtering new block on tags 30529 1726882600.16637: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 30529 1726882600.16642: extending task lists for all hosts with included blocks 30529 1726882600.19251: done extending task lists 30529 1726882600.19253: done processing included files 30529 1726882600.19254: results queue empty 30529 1726882600.19255: checking for any_errors_fatal 30529 1726882600.19259: done checking for any_errors_fatal 30529 1726882600.19260: checking for max_fail_percentage 30529 1726882600.19261: done checking for max_fail_percentage 30529 1726882600.19262: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.19263: done checking to see if all hosts have failed 30529 1726882600.19264: getting the remaining hosts for this loop 30529 1726882600.19265: done getting the remaining hosts for this loop 30529 1726882600.19268: getting the next task for host managed_node1 30529 1726882600.19273: done getting next task for host managed_node1 30529 1726882600.19280: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882600.19284: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882600.19297: getting variables 30529 1726882600.19298: in VariableManager get_vars() 30529 1726882600.19307: Calling all_inventory to load vars for managed_node1 30529 1726882600.19309: Calling groups_inventory to load vars for managed_node1 30529 1726882600.19312: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.19318: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.19320: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.19328: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.20691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.22511: done with get_vars() 30529 1726882600.22531: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:36:40 -0400 (0:00:00.173) 0:00:14.252 ****** 30529 1726882600.22607: entering _queue_task() for managed_node1/include_tasks 30529 1726882600.22918: worker is 1 (out of 1 available) 30529 1726882600.22930: exiting _queue_task() for managed_node1/include_tasks 30529 1726882600.22941: done queuing things up, now waiting for results queue to drain 30529 1726882600.22943: waiting for pending results... 
30529 1726882600.23322: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882600.23334: in run() - task 12673a56-9f93-b0f1-edc0-000000000452 30529 1726882600.23348: variable 'ansible_search_path' from source: unknown 30529 1726882600.23351: variable 'ansible_search_path' from source: unknown 30529 1726882600.23387: calling self._execute() 30529 1726882600.23474: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.23477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.23526: variable 'omit' from source: magic vars 30529 1726882600.23851: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.23860: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.23866: _execute() done 30529 1726882600.23870: dumping result to json 30529 1726882600.23873: done dumping result, returning 30529 1726882600.23880: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-000000000452] 30529 1726882600.23885: sending task result for task 12673a56-9f93-b0f1-edc0-000000000452 30529 1726882600.24018: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000452 30529 1726882600.24021: WORKER PROCESS EXITING 30529 1726882600.24085: no more pending results, returning what we have 30529 1726882600.24089: in VariableManager get_vars() 30529 1726882600.24119: Calling all_inventory to load vars for managed_node1 30529 1726882600.24122: Calling groups_inventory to load vars for managed_node1 30529 1726882600.24124: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.24133: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.24135: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.24138: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882600.25331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.26919: done with get_vars() 30529 1726882600.26938: variable 'ansible_search_path' from source: unknown 30529 1726882600.26939: variable 'ansible_search_path' from source: unknown 30529 1726882600.27071: variable 'item' from source: include params 30529 1726882600.27108: we have included files to process 30529 1726882600.27110: generating all_blocks data 30529 1726882600.27112: done generating all_blocks data 30529 1726882600.27113: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882600.27114: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882600.27116: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882600.27291: done processing included file 30529 1726882600.27295: iterating over new_blocks loaded from include file 30529 1726882600.27297: in VariableManager get_vars() 30529 1726882600.27313: done with get_vars() 30529 1726882600.27315: filtering new block on tags 30529 1726882600.27341: done filtering new block on tags 30529 1726882600.27344: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882600.27349: extending task lists for all hosts with included blocks 30529 1726882600.27508: done extending task lists 30529 1726882600.27509: done processing included files 30529 1726882600.27510: results queue empty 30529 1726882600.27511: checking for any_errors_fatal 30529 1726882600.27515: done checking for any_errors_fatal 30529 1726882600.27516: checking for 
max_fail_percentage 30529 1726882600.27517: done checking for max_fail_percentage 30529 1726882600.27518: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.27519: done checking to see if all hosts have failed 30529 1726882600.27520: getting the remaining hosts for this loop 30529 1726882600.27521: done getting the remaining hosts for this loop 30529 1726882600.27524: getting the next task for host managed_node1 30529 1726882600.27529: done getting next task for host managed_node1 30529 1726882600.27531: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882600.27534: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882600.27537: getting variables 30529 1726882600.27538: in VariableManager get_vars() 30529 1726882600.27547: Calling all_inventory to load vars for managed_node1 30529 1726882600.27549: Calling groups_inventory to load vars for managed_node1 30529 1726882600.27551: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.27557: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.27559: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.27562: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.28725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.30060: done with get_vars() 30529 1726882600.30081: done getting variables 30529 1726882600.30200: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:40 -0400 (0:00:00.076) 0:00:14.328 ****** 30529 1726882600.30229: entering _queue_task() for managed_node1/stat 30529 1726882600.30567: worker is 1 (out of 1 available) 30529 1726882600.30582: exiting _queue_task() for managed_node1/stat 30529 1726882600.30596: done queuing things up, now waiting for results queue to drain 30529 1726882600.30598: waiting for pending results... 
30529 1726882600.31012: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882600.31025: in run() - task 12673a56-9f93-b0f1-edc0-0000000004e8 30529 1726882600.31046: variable 'ansible_search_path' from source: unknown 30529 1726882600.31053: variable 'ansible_search_path' from source: unknown 30529 1726882600.31095: calling self._execute() 30529 1726882600.31186: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.31201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.31218: variable 'omit' from source: magic vars 30529 1726882600.31582: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.31675: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.31678: variable 'omit' from source: magic vars 30529 1726882600.31679: variable 'omit' from source: magic vars 30529 1726882600.31778: variable 'interface' from source: play vars 30529 1726882600.31808: variable 'omit' from source: magic vars 30529 1726882600.31858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882600.32001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882600.32004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882600.32007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.32009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.32011: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882600.32018: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.32027: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.32143: Set connection var ansible_shell_executable to /bin/sh 30529 1726882600.32155: Set connection var ansible_pipelining to False 30529 1726882600.32163: Set connection var ansible_shell_type to sh 30529 1726882600.32177: Set connection var ansible_timeout to 10 30529 1726882600.32183: Set connection var ansible_connection to ssh 30529 1726882600.32196: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882600.32229: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.32238: variable 'ansible_connection' from source: unknown 30529 1726882600.32246: variable 'ansible_module_compression' from source: unknown 30529 1726882600.32252: variable 'ansible_shell_type' from source: unknown 30529 1726882600.32258: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.32265: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.32272: variable 'ansible_pipelining' from source: unknown 30529 1726882600.32279: variable 'ansible_timeout' from source: unknown 30529 1726882600.32287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.32513: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882600.32546: variable 'omit' from source: magic vars 30529 1726882600.32549: starting attempt loop 30529 1726882600.32554: running the handler 30529 1726882600.32655: _low_level_execute_command(): starting 30529 1726882600.32659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882600.33415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.33431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.33520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.35197: stdout chunk (state=3): >>>/root <<< 30529 1726882600.35345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.35349: stdout chunk (state=3): >>><<< 30529 1726882600.35351: stderr chunk (state=3): >>><<< 30529 1726882600.35459: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.35483: _low_level_execute_command(): starting 30529 1726882600.35485: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576 `" && echo ansible-tmp-1726882600.3537393-31188-50878003081576="` echo /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576 `" ) && sleep 0' 30529 1726882600.36009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
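The `_low_level_execute_command()` call above creates the remote temp directory inside a `umask 77` subshell. A minimal local re-run of the same pattern (the paths here are illustrative, not the timestamped ones from the log) showing why the resulting directory comes out mode 0700:

```shell
# Reproduce the remote tmp-dir creation pattern locally (illustrative path).
rm -rf /tmp/ansible-demo-tmp
( umask 77 && mkdir -p /tmp/ansible-demo-tmp \
    && mkdir /tmp/ansible-demo-tmp/ansible-tmp-demo-1 )
# umask 077 strips group/other bits from mkdir's default 0777,
# so only the owner can enter the directory.
stat -c '%a' /tmp/ansible-demo-tmp/ansible-tmp-demo-1   # prints 700 (GNU stat)
```

The subshell keeps the umask change from leaking into the rest of the `sh -c` command line, which is why Ansible wraps the whole thing in `( ... )`.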
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.36043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882600.36053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.36324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.36371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.38235: stdout chunk (state=3): >>>ansible-tmp-1726882600.3537393-31188-50878003081576=/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576 <<< 30529 1726882600.38398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.38402: stderr chunk (state=3): >>><<< 30529 1726882600.38405: stdout chunk (state=3): >>><<< 30529 1726882600.38407: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882600.3537393-31188-50878003081576=/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.38453: variable 'ansible_module_compression' from source: unknown 30529 1726882600.38510: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882600.38541: variable 'ansible_facts' from source: unknown 30529 1726882600.38635: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py 30529 1726882600.38851: Sending initial data 30529 1726882600.38854: Sent initial data (152 bytes) 30529 1726882600.39399: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882600.39402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882600.39405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.39407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882600.39459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.39491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.39562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.41112: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882600.41136: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882600.41179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882600.41234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpp5j_mv33 /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py <<< 30529 1726882600.41238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py" <<< 30529 1726882600.41273: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpp5j_mv33" to remote "/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py" <<< 30529 1726882600.42010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.42023: stderr chunk (state=3): >>><<< 30529 1726882600.42043: stdout chunk (state=3): >>><<< 30529 1726882600.42152: done transferring module to remote 30529 1726882600.42155: _low_level_execute_command(): starting 30529 1726882600.42158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/ /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py && sleep 0' 30529 1726882600.42736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882600.42770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882600.42786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.42909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882600.42953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.42988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.43063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.44811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.44815: stdout chunk (state=3): >>><<< 30529 1726882600.44817: stderr chunk (state=3): >>><<< 30529 1726882600.44899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.44903: _low_level_execute_command(): starting 30529 1726882600.44905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/AnsiballZ_stat.py && sleep 0' 30529 1726882600.45514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882600.45529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882600.45546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.45618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.45686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882600.45711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
30529 1726882600.45726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.45885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.60830: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30576, "dev": 23, "nlink": 1, "atime": 1726882597.2238646, "mtime": 1726882597.2238646, "ctime": 1726882597.2238646, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882600.61999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
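The stdout chunk above is the JSON document the stat module prints on the remote host; the controller parses it to build the task result. A minimal sketch of that decoding step, using an abbreviated copy of the payload from this run (not the full dict):

```python
import json

# Abbreviated copy of the stat module's stdout from the logged run.
payload = '''{"changed": false, "stat": {"exists": true, "islnk": true,
 "path": "/sys/class/net/statebr",
 "lnk_target": "../../devices/virtual/net/statebr"}}'''

result = json.loads(payload)
stat = result["stat"]

# Entries under /sys/class/net are symlinks into /sys/devices,
# which is why the module reports islnk=true with mode 0777.
assert stat["exists"] and stat["islnk"]
print(stat["lnk_target"])  # -> ../../devices/virtual/net/statebr
```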
<<< 30529 1726882600.62024: stderr chunk (state=3): >>><<< 30529 1726882600.62029: stdout chunk (state=3): >>><<< 30529 1726882600.62045: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30576, "dev": 23, "nlink": 1, "atime": 1726882597.2238646, "mtime": 1726882597.2238646, "ctime": 1726882597.2238646, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882600.62082: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882600.62092: _low_level_execute_command(): starting 30529 1726882600.62098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882600.3537393-31188-50878003081576/ > /dev/null 2>&1 && sleep 0' 30529 1726882600.62524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.62527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882600.62536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.62539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.62541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.62602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.62655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.62714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.64474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.64496: stderr chunk (state=3): >>><<< 30529 1726882600.64500: stdout chunk (state=3): >>><<< 30529 1726882600.64515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.64519: handler run complete 30529 1726882600.64557: attempt loop complete, returning result 30529 1726882600.64560: _execute() done 30529 1726882600.64562: dumping result to json 30529 1726882600.64565: done dumping result, returning 30529 1726882600.64572: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-0000000004e8] 30529 1726882600.64577: sending task result for task 12673a56-9f93-b0f1-edc0-0000000004e8 30529 1726882600.64677: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000004e8 30529 1726882600.64680: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882597.2238646, "block_size": 4096, "blocks": 0, "ctime": 1726882597.2238646, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30576, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882597.2238646, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30529 1726882600.64765: no more pending results, returning what we 
have 30529 1726882600.64768: results queue empty 30529 1726882600.64769: checking for any_errors_fatal 30529 1726882600.64771: done checking for any_errors_fatal 30529 1726882600.64771: checking for max_fail_percentage 30529 1726882600.64773: done checking for max_fail_percentage 30529 1726882600.64774: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.64775: done checking to see if all hosts have failed 30529 1726882600.64775: getting the remaining hosts for this loop 30529 1726882600.64777: done getting the remaining hosts for this loop 30529 1726882600.64780: getting the next task for host managed_node1 30529 1726882600.64794: done getting next task for host managed_node1 30529 1726882600.64803: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30529 1726882600.64806: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882600.64811: getting variables 30529 1726882600.64813: in VariableManager get_vars() 30529 1726882600.64841: Calling all_inventory to load vars for managed_node1 30529 1726882600.64844: Calling groups_inventory to load vars for managed_node1 30529 1726882600.64846: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.64856: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.64858: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.64860: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.65733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.66600: done with get_vars() 30529 1726882600.66615: done getting variables 30529 1726882600.66658: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882600.66743: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:36:40 -0400 (0:00:00.365) 0:00:14.693 ****** 30529 1726882600.66767: entering _queue_task() for managed_node1/assert 30529 1726882600.66985: worker is 1 (out of 1 available) 30529 1726882600.67001: exiting _queue_task() for managed_node1/assert 30529 1726882600.67012: done queuing things up, now waiting for results queue to drain 30529 1726882600.67014: waiting for pending results... 
30529 1726882600.67184: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' 30529 1726882600.67259: in run() - task 12673a56-9f93-b0f1-edc0-000000000453 30529 1726882600.67271: variable 'ansible_search_path' from source: unknown 30529 1726882600.67274: variable 'ansible_search_path' from source: unknown 30529 1726882600.67303: calling self._execute() 30529 1726882600.67370: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.67375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.67382: variable 'omit' from source: magic vars 30529 1726882600.67637: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.67646: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.67651: variable 'omit' from source: magic vars 30529 1726882600.67682: variable 'omit' from source: magic vars 30529 1726882600.67752: variable 'interface' from source: play vars 30529 1726882600.67764: variable 'omit' from source: magic vars 30529 1726882600.67799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882600.67826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882600.67841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882600.67854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.67866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.67892: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882600.67906: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.67909: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.67965: Set connection var ansible_shell_executable to /bin/sh 30529 1726882600.67969: Set connection var ansible_pipelining to False 30529 1726882600.67971: Set connection var ansible_shell_type to sh 30529 1726882600.67979: Set connection var ansible_timeout to 10 30529 1726882600.67982: Set connection var ansible_connection to ssh 30529 1726882600.67989: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882600.68010: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.68014: variable 'ansible_connection' from source: unknown 30529 1726882600.68017: variable 'ansible_module_compression' from source: unknown 30529 1726882600.68019: variable 'ansible_shell_type' from source: unknown 30529 1726882600.68022: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.68024: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.68026: variable 'ansible_pipelining' from source: unknown 30529 1726882600.68028: variable 'ansible_timeout' from source: unknown 30529 1726882600.68030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.68126: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882600.68134: variable 'omit' from source: magic vars 30529 1726882600.68141: starting attempt loop 30529 1726882600.68144: running the handler 30529 1726882600.68232: variable 'interface_stat' from source: set_fact 30529 1726882600.68245: Evaluated conditional (interface_stat.stat.exists): True 30529 1726882600.68250: handler run complete 30529 1726882600.68263: attempt loop complete, returning result 30529 
1726882600.68266: _execute() done 30529 1726882600.68269: dumping result to json 30529 1726882600.68271: done dumping result, returning 30529 1726882600.68277: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' [12673a56-9f93-b0f1-edc0-000000000453] 30529 1726882600.68283: sending task result for task 12673a56-9f93-b0f1-edc0-000000000453 30529 1726882600.68363: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000453 30529 1726882600.68366: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882600.68419: no more pending results, returning what we have 30529 1726882600.68422: results queue empty 30529 1726882600.68423: checking for any_errors_fatal 30529 1726882600.68430: done checking for any_errors_fatal 30529 1726882600.68430: checking for max_fail_percentage 30529 1726882600.68432: done checking for max_fail_percentage 30529 1726882600.68433: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.68433: done checking to see if all hosts have failed 30529 1726882600.68434: getting the remaining hosts for this loop 30529 1726882600.68436: done getting the remaining hosts for this loop 30529 1726882600.68439: getting the next task for host managed_node1 30529 1726882600.68446: done getting next task for host managed_node1 30529 1726882600.68449: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882600.68452: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882600.68455: getting variables 30529 1726882600.68456: in VariableManager get_vars() 30529 1726882600.68492: Calling all_inventory to load vars for managed_node1 30529 1726882600.68496: Calling groups_inventory to load vars for managed_node1 30529 1726882600.68499: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.68507: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.68510: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.68513: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.69240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.70079: done with get_vars() 30529 1726882600.70095: done getting variables 30529 1726882600.70135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882600.70211: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:36:40 -0400 (0:00:00.034) 0:00:14.728 ****** 30529 1726882600.70233: entering _queue_task() for managed_node1/debug 30529 1726882600.70428: worker is 1 (out of 1 available) 30529 1726882600.70440: exiting _queue_task() for managed_node1/debug 30529 1726882600.70452: done queuing things up, now waiting for results queue to drain 30529 1726882600.70454: waiting for pending results... 
30529 1726882600.70646: running TaskExecutor() for managed_node1/TASK: Success in test 'I can create a profile' 30529 1726882600.70717: in run() - task 12673a56-9f93-b0f1-edc0-000000000098 30529 1726882600.70727: variable 'ansible_search_path' from source: unknown 30529 1726882600.70730: variable 'ansible_search_path' from source: unknown 30529 1726882600.70757: calling self._execute() 30529 1726882600.70833: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.70836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.70845: variable 'omit' from source: magic vars 30529 1726882600.71106: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.71119: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.71122: variable 'omit' from source: magic vars 30529 1726882600.71161: variable 'omit' from source: magic vars 30529 1726882600.71229: variable 'lsr_description' from source: include params 30529 1726882600.71245: variable 'omit' from source: magic vars 30529 1726882600.71275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882600.71305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882600.71320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882600.71336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.71346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.71368: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882600.71371: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.71374: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.71443: Set connection var ansible_shell_executable to /bin/sh 30529 1726882600.71448: Set connection var ansible_pipelining to False 30529 1726882600.71451: Set connection var ansible_shell_type to sh 30529 1726882600.71458: Set connection var ansible_timeout to 10 30529 1726882600.71460: Set connection var ansible_connection to ssh 30529 1726882600.71466: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882600.71488: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.71492: variable 'ansible_connection' from source: unknown 30529 1726882600.71496: variable 'ansible_module_compression' from source: unknown 30529 1726882600.71499: variable 'ansible_shell_type' from source: unknown 30529 1726882600.71501: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.71503: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.71505: variable 'ansible_pipelining' from source: unknown 30529 1726882600.71507: variable 'ansible_timeout' from source: unknown 30529 1726882600.71509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.71612: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882600.71622: variable 'omit' from source: magic vars 30529 1726882600.71627: starting attempt loop 30529 1726882600.71630: running the handler 30529 1726882600.71664: handler run complete 30529 1726882600.71677: attempt loop complete, returning result 30529 1726882600.71680: _execute() done 30529 1726882600.71683: dumping result to json 30529 1726882600.71687: done dumping result, returning 30529 
1726882600.71690: done running TaskExecutor() for managed_node1/TASK: Success in test 'I can create a profile' [12673a56-9f93-b0f1-edc0-000000000098] 30529 1726882600.71696: sending task result for task 12673a56-9f93-b0f1-edc0-000000000098 30529 1726882600.71771: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000098 30529 1726882600.71773: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 30529 1726882600.71825: no more pending results, returning what we have 30529 1726882600.71828: results queue empty 30529 1726882600.71829: checking for any_errors_fatal 30529 1726882600.71834: done checking for any_errors_fatal 30529 1726882600.71835: checking for max_fail_percentage 30529 1726882600.71836: done checking for max_fail_percentage 30529 1726882600.71837: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.71838: done checking to see if all hosts have failed 30529 1726882600.71838: getting the remaining hosts for this loop 30529 1726882600.71840: done getting the remaining hosts for this loop 30529 1726882600.71844: getting the next task for host managed_node1 30529 1726882600.71851: done getting next task for host managed_node1 30529 1726882600.71853: ^ task is: TASK: Cleanup 30529 1726882600.71855: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882600.71862: getting variables 30529 1726882600.71864: in VariableManager get_vars() 30529 1726882600.71890: Calling all_inventory to load vars for managed_node1 30529 1726882600.71894: Calling groups_inventory to load vars for managed_node1 30529 1726882600.71897: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.71945: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.71948: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.71952: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.76813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.78328: done with get_vars() 30529 1726882600.78349: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:36:40 -0400 (0:00:00.081) 0:00:14.810 ****** 30529 1726882600.78431: entering _queue_task() for managed_node1/include_tasks 30529 1726882600.78768: worker is 1 (out of 1 available) 30529 1726882600.78781: exiting _queue_task() for managed_node1/include_tasks 30529 1726882600.78998: done queuing things up, now waiting for results queue to drain 30529 1726882600.79000: waiting for pending results... 
30529 1726882600.79130: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882600.79210: in run() - task 12673a56-9f93-b0f1-edc0-00000000009c 30529 1726882600.79235: variable 'ansible_search_path' from source: unknown 30529 1726882600.79335: variable 'ansible_search_path' from source: unknown 30529 1726882600.79340: variable 'lsr_cleanup' from source: include params 30529 1726882600.79521: variable 'lsr_cleanup' from source: include params 30529 1726882600.79596: variable 'omit' from source: magic vars 30529 1726882600.79738: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.79759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.79781: variable 'omit' from source: magic vars 30529 1726882600.80021: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.80036: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.80049: variable 'item' from source: unknown 30529 1726882600.80123: variable 'item' from source: unknown 30529 1726882600.80160: variable 'item' from source: unknown 30529 1726882600.80228: variable 'item' from source: unknown 30529 1726882600.80549: dumping result to json 30529 1726882600.80553: done dumping result, returning 30529 1726882600.80555: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-00000000009c] 30529 1726882600.80557: sending task result for task 12673a56-9f93-b0f1-edc0-00000000009c 30529 1726882600.80604: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000009c 30529 1726882600.80608: WORKER PROCESS EXITING 30529 1726882600.80631: no more pending results, returning what we have 30529 1726882600.80636: in VariableManager get_vars() 30529 1726882600.80669: Calling all_inventory to load vars for managed_node1 30529 1726882600.80672: Calling groups_inventory to load vars for managed_node1 30529 1726882600.80676: Calling 
all_plugins_inventory to load vars for managed_node1 30529 1726882600.80694: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.80698: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.80702: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.82113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.83644: done with get_vars() 30529 1726882600.83661: variable 'ansible_search_path' from source: unknown 30529 1726882600.83662: variable 'ansible_search_path' from source: unknown 30529 1726882600.83704: we have included files to process 30529 1726882600.83705: generating all_blocks data 30529 1726882600.83707: done generating all_blocks data 30529 1726882600.83711: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882600.83712: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882600.83715: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882600.83953: done processing included file 30529 1726882600.83955: iterating over new_blocks loaded from include file 30529 1726882600.83956: in VariableManager get_vars() 30529 1726882600.83970: done with get_vars() 30529 1726882600.83972: filtering new block on tags 30529 1726882600.84003: done filtering new block on tags 30529 1726882600.84006: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882600.84010: extending task lists for all hosts with included blocks 
30529 1726882600.85443: done extending task lists 30529 1726882600.85444: done processing included files 30529 1726882600.85445: results queue empty 30529 1726882600.85446: checking for any_errors_fatal 30529 1726882600.85451: done checking for any_errors_fatal 30529 1726882600.85451: checking for max_fail_percentage 30529 1726882600.85453: done checking for max_fail_percentage 30529 1726882600.85453: checking to see if all hosts have failed and the running result is not ok 30529 1726882600.85454: done checking to see if all hosts have failed 30529 1726882600.85455: getting the remaining hosts for this loop 30529 1726882600.85456: done getting the remaining hosts for this loop 30529 1726882600.85459: getting the next task for host managed_node1 30529 1726882600.85464: done getting next task for host managed_node1 30529 1726882600.85466: ^ task is: TASK: Cleanup profile and device 30529 1726882600.85468: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882600.85471: getting variables 30529 1726882600.85472: in VariableManager get_vars() 30529 1726882600.85481: Calling all_inventory to load vars for managed_node1 30529 1726882600.85483: Calling groups_inventory to load vars for managed_node1 30529 1726882600.85488: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882600.85495: Calling all_plugins_play to load vars for managed_node1 30529 1726882600.85498: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882600.85501: Calling groups_plugins_play to load vars for managed_node1 30529 1726882600.86734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882600.88250: done with get_vars() 30529 1726882600.88270: done getting variables 30529 1726882600.88318: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:36:40 -0400 (0:00:00.099) 0:00:14.909 ****** 30529 1726882600.88348: entering _queue_task() for managed_node1/shell 30529 1726882600.88707: worker is 1 (out of 1 available) 30529 1726882600.88721: exiting _queue_task() for managed_node1/shell 30529 1726882600.88733: done queuing things up, now waiting for results queue to drain 30529 1726882600.88735: waiting for pending results... 
30529 1726882600.89116: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882600.89153: in run() - task 12673a56-9f93-b0f1-edc0-00000000050b 30529 1726882600.89172: variable 'ansible_search_path' from source: unknown 30529 1726882600.89180: variable 'ansible_search_path' from source: unknown 30529 1726882600.89227: calling self._execute() 30529 1726882600.89323: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.89335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.89349: variable 'omit' from source: magic vars 30529 1726882600.89724: variable 'ansible_distribution_major_version' from source: facts 30529 1726882600.89742: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882600.89759: variable 'omit' from source: magic vars 30529 1726882600.89815: variable 'omit' from source: magic vars 30529 1726882600.89973: variable 'interface' from source: play vars 30529 1726882600.90002: variable 'omit' from source: magic vars 30529 1726882600.90045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882600.90096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882600.90123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882600.90191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.90196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882600.90202: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882600.90211: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.90219: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.90329: Set connection var ansible_shell_executable to /bin/sh 30529 1726882600.90341: Set connection var ansible_pipelining to False 30529 1726882600.90349: Set connection var ansible_shell_type to sh 30529 1726882600.90364: Set connection var ansible_timeout to 10 30529 1726882600.90399: Set connection var ansible_connection to ssh 30529 1726882600.90406: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882600.90409: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.90413: variable 'ansible_connection' from source: unknown 30529 1726882600.90415: variable 'ansible_module_compression' from source: unknown 30529 1726882600.90417: variable 'ansible_shell_type' from source: unknown 30529 1726882600.90420: variable 'ansible_shell_executable' from source: unknown 30529 1726882600.90445: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882600.90449: variable 'ansible_pipelining' from source: unknown 30529 1726882600.90451: variable 'ansible_timeout' from source: unknown 30529 1726882600.90453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882600.90662: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882600.90666: variable 'omit' from source: magic vars 30529 1726882600.90668: starting attempt loop 30529 1726882600.90671: running the handler 30529 1726882600.90673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882600.90675: _low_level_execute_command(): starting 30529 1726882600.90677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882600.91346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882600.91357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882600.91368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882600.91474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.91504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.91575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.93252: stdout chunk (state=3): >>>/root <<< 30529 1726882600.93428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.93432: stdout chunk (state=3): 
>>><<< 30529 1726882600.93434: stderr chunk (state=3): >>><<< 30529 1726882600.93591: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.93597: _low_level_execute_command(): starting 30529 1726882600.93600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765 `" && echo ansible-tmp-1726882600.9350266-31217-194409462654765="` echo /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765 `" ) && sleep 0' 30529 1726882600.94347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.94359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.94572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.94576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.96440: stdout chunk (state=3): >>>ansible-tmp-1726882600.9350266-31217-194409462654765=/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765 <<< 30529 1726882600.96599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882600.96602: stdout chunk (state=3): >>><<< 30529 1726882600.96611: stderr chunk (state=3): >>><<< 30529 1726882600.96798: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882600.9350266-31217-194409462654765=/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882600.96802: variable 'ansible_module_compression' from source: unknown 30529 1726882600.96805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882600.96807: variable 'ansible_facts' from source: unknown 30529 1726882600.96857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py 30529 1726882600.97051: Sending initial data 30529 1726882600.97061: Sent initial data (156 bytes) 30529 1726882600.97680: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882600.97710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882600.97814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882600.97831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882600.97848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882600.97869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882600.98041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882600.99563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882600.99590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882600.99651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_9ozso7p /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py <<< 30529 1726882600.99843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py" <<< 30529 1726882600.99878: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_9ozso7p" to remote "/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py" <<< 30529 1726882601.01109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882601.01118: stdout chunk (state=3): >>><<< 30529 1726882601.01128: stderr chunk (state=3): >>><<< 30529 1726882601.01303: done transferring module to remote 30529 1726882601.01381: _low_level_execute_command(): starting 30529 1726882601.01384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/ /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py && sleep 0' 30529 1726882601.02430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882601.02458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882601.02471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882601.02485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882601.02508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 
30529 1726882601.02606: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882601.02666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882601.02809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882601.02844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882601.04846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882601.04849: stdout chunk (state=3): >>><<< 30529 1726882601.04852: stderr chunk (state=3): >>><<< 30529 1726882601.04855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882601.04858: _low_level_execute_command(): starting 30529 1726882601.04862: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/AnsiballZ_command.py && sleep 0' 30529 1726882601.06047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882601.06075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882601.06096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882601.06239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882601.06302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882601.26876: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (83eb4df0-1051-4e47-8e0d-c7726e739ed9) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:36:41.214032", "end": "2024-09-20 21:36:41.267147", "delta": "0:00:00.053115", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 30529 1726882601.26994: stdout chunk (state=3): >>> <<< 30529 1726882601.28622: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 30529 1726882601.28661: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882601.28708: stderr chunk (state=3): >>><<< 30529 1726882601.28711: stdout chunk (state=3): >>><<< 30529 1726882601.29057: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (83eb4df0-1051-4e47-8e0d-c7726e739ed9) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:36:41.214032", "end": "2024-09-20 21:36:41.267147", "delta": "0:00:00.053115", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 30529 1726882601.29061: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882601.29068: _low_level_execute_command(): starting 30529 1726882601.29070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882600.9350266-31217-194409462654765/ > /dev/null 2>&1 && sleep 0' 30529 1726882601.30147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882601.30207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882601.30219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882601.30388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882601.32599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882601.32603: stdout chunk (state=3): >>><<< 30529 1726882601.32605: stderr chunk (state=3): >>><<< 30529 1726882601.32607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882601.32610: handler run complete 30529 1726882601.32612: Evaluated conditional (False): False 30529 1726882601.32613: attempt loop complete, returning result 30529 1726882601.32615: _execute() done 30529 1726882601.32617: dumping result to json 30529 1726882601.32619: done dumping result, returning 30529 1726882601.32621: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-00000000050b] 30529 1726882601.32623: sending task result for task 12673a56-9f93-b0f1-edc0-00000000050b 30529 1726882601.32692: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000050b 30529 1726882601.32697: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.053115", "end": "2024-09-20 21:36:41.267147", "rc": 1, "start": "2024-09-20 21:36:41.214032" } STDOUT: Connection 'statebr' (83eb4df0-1051-4e47-8e0d-c7726e739ed9) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30529 1726882601.32758: no more pending results, returning what we have 30529 1726882601.32762: results queue empty 30529 1726882601.32762: checking for any_errors_fatal 30529 1726882601.32763: done checking for any_errors_fatal 30529 1726882601.32764: checking for max_fail_percentage 30529 1726882601.32765: done checking for max_fail_percentage 30529 1726882601.32766: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.32767: done checking to see if all hosts have failed 30529 1726882601.32768: getting the remaining hosts for this loop 30529 1726882601.32769: done getting the remaining hosts for this loop 30529 1726882601.32773: getting the next task for host managed_node1 30529 1726882601.32783: done getting next task for host managed_node1 30529 1726882601.32788: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882601.32790: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882601.32796: getting variables 30529 1726882601.32798: in VariableManager get_vars() 30529 1726882601.32827: Calling all_inventory to load vars for managed_node1 30529 1726882601.32830: Calling groups_inventory to load vars for managed_node1 30529 1726882601.32833: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.32843: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.32845: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.32848: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.35841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.39247: done with get_vars() 30529 1726882601.39268: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Friday 20 September 2024 21:36:41 -0400 (0:00:00.510) 0:00:15.419 ****** 30529 1726882601.39366: entering _queue_task() for managed_node1/include_tasks 30529 1726882601.40123: worker is 1 (out of 1 available) 30529 1726882601.40136: exiting _queue_task() for managed_node1/include_tasks 30529 1726882601.40149: done queuing things up, now waiting for results queue to drain 30529 1726882601.40151: waiting for pending results... 
30529 1726882601.40706: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' 30529 1726882601.40732: in run() - task 12673a56-9f93-b0f1-edc0-00000000000f 30529 1726882601.40744: variable 'ansible_search_path' from source: unknown 30529 1726882601.40807: calling self._execute() 30529 1726882601.41071: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.41075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.41089: variable 'omit' from source: magic vars 30529 1726882601.41849: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.41909: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.41914: _execute() done 30529 1726882601.41917: dumping result to json 30529 1726882601.41919: done dumping result, returning 30529 1726882601.41922: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-00000000000f] 30529 1726882601.41924: sending task result for task 12673a56-9f93-b0f1-edc0-00000000000f 30529 1726882601.41997: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000000f 30529 1726882601.42000: WORKER PROCESS EXITING 30529 1726882601.42035: no more pending results, returning what we have 30529 1726882601.42040: in VariableManager get_vars() 30529 1726882601.42077: Calling all_inventory to load vars for managed_node1 30529 1726882601.42080: Calling groups_inventory to load vars for managed_node1 30529 1726882601.42084: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.42103: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.42107: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.42111: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.43983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30529 1726882601.45671: done with get_vars() 30529 1726882601.45701: variable 'ansible_search_path' from source: unknown 30529 1726882601.45716: we have included files to process 30529 1726882601.45718: generating all_blocks data 30529 1726882601.45720: done generating all_blocks data 30529 1726882601.45726: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882601.45727: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882601.45729: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882601.46233: in VariableManager get_vars() 30529 1726882601.46257: done with get_vars() 30529 1726882601.46308: in VariableManager get_vars() 30529 1726882601.46333: done with get_vars() 30529 1726882601.46400: in VariableManager get_vars() 30529 1726882601.46423: done with get_vars() 30529 1726882601.46466: in VariableManager get_vars() 30529 1726882601.46482: done with get_vars() 30529 1726882601.46532: in VariableManager get_vars() 30529 1726882601.46548: done with get_vars() 30529 1726882601.47042: in VariableManager get_vars() 30529 1726882601.47057: done with get_vars() 30529 1726882601.47074: done processing included file 30529 1726882601.47076: iterating over new_blocks loaded from include file 30529 1726882601.47077: in VariableManager get_vars() 30529 1726882601.47091: done with get_vars() 30529 1726882601.47094: filtering new block on tags 30529 1726882601.47232: done filtering new block on tags 30529 1726882601.47235: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1 30529 1726882601.47241: extending task lists for all hosts with included 
blocks 30529 1726882601.47275: done extending task lists 30529 1726882601.47276: done processing included files 30529 1726882601.47277: results queue empty 30529 1726882601.47278: checking for any_errors_fatal 30529 1726882601.47289: done checking for any_errors_fatal 30529 1726882601.47291: checking for max_fail_percentage 30529 1726882601.47292: done checking for max_fail_percentage 30529 1726882601.47294: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.47295: done checking to see if all hosts have failed 30529 1726882601.47296: getting the remaining hosts for this loop 30529 1726882601.47297: done getting the remaining hosts for this loop 30529 1726882601.47300: getting the next task for host managed_node1 30529 1726882601.47304: done getting next task for host managed_node1 30529 1726882601.47306: ^ task is: TASK: TEST: {{ lsr_description }} 30529 1726882601.47309: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882601.47311: getting variables 30529 1726882601.47312: in VariableManager get_vars() 30529 1726882601.47321: Calling all_inventory to load vars for managed_node1 30529 1726882601.47323: Calling groups_inventory to load vars for managed_node1 30529 1726882601.47330: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.47336: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.47339: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.47342: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.49481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.51624: done with get_vars() 30529 1726882601.51650: done getting variables 30529 1726882601.51705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882601.51827: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:36:41 -0400 (0:00:00.124) 0:00:15.544 ****** 30529 1726882601.51857: entering _queue_task() for managed_node1/debug 30529 1726882601.52573: worker is 1 (out of 1 available) 30529 1726882601.52588: exiting _queue_task() for managed_node1/debug 30529 1726882601.52601: done queuing things up, now waiting for results queue to drain 30529 1726882601.52602: waiting for pending results... 
30529 1726882601.53228: running TaskExecutor() for managed_node1/TASK: TEST: I can create a profile without autoconnect 30529 1726882601.53233: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b4 30529 1726882601.53236: variable 'ansible_search_path' from source: unknown 30529 1726882601.53238: variable 'ansible_search_path' from source: unknown 30529 1726882601.53242: calling self._execute() 30529 1726882601.53320: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.53335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.53350: variable 'omit' from source: magic vars 30529 1726882601.53744: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.53770: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.53781: variable 'omit' from source: magic vars 30529 1726882601.53826: variable 'omit' from source: magic vars 30529 1726882601.53937: variable 'lsr_description' from source: include params 30529 1726882601.53960: variable 'omit' from source: magic vars 30529 1726882601.54018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882601.54059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.54199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882601.54202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.54204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.54206: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.54208: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882601.54210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.54290: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.54344: Set connection var ansible_pipelining to False 30529 1726882601.54352: Set connection var ansible_shell_type to sh 30529 1726882601.54366: Set connection var ansible_timeout to 10 30529 1726882601.54373: Set connection var ansible_connection to ssh 30529 1726882601.54382: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.54430: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.54439: variable 'ansible_connection' from source: unknown 30529 1726882601.54446: variable 'ansible_module_compression' from source: unknown 30529 1726882601.54453: variable 'ansible_shell_type' from source: unknown 30529 1726882601.54460: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.54467: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.54475: variable 'ansible_pipelining' from source: unknown 30529 1726882601.54481: variable 'ansible_timeout' from source: unknown 30529 1726882601.54492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.54658: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.54699: variable 'omit' from source: magic vars 30529 1726882601.54702: starting attempt loop 30529 1726882601.54705: running the handler 30529 1726882601.54750: handler run complete 30529 1726882601.54768: attempt loop complete, returning result 30529 1726882601.54774: _execute() done 30529 1726882601.54853: dumping result to json 30529 1726882601.54856: done dumping result, returning 
30529 1726882601.54858: done running TaskExecutor() for managed_node1/TASK: TEST: I can create a profile without autoconnect [12673a56-9f93-b0f1-edc0-0000000005b4] 30529 1726882601.54864: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b4 30529 1726882601.54931: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b4 30529 1726882601.54934: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ########## I can create a profile without autoconnect ########## 30529 1726882601.55009: no more pending results, returning what we have 30529 1726882601.55014: results queue empty 30529 1726882601.55015: checking for any_errors_fatal 30529 1726882601.55016: done checking for any_errors_fatal 30529 1726882601.55017: checking for max_fail_percentage 30529 1726882601.55019: done checking for max_fail_percentage 30529 1726882601.55020: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.55021: done checking to see if all hosts have failed 30529 1726882601.55021: getting the remaining hosts for this loop 30529 1726882601.55023: done getting the remaining hosts for this loop 30529 1726882601.55028: getting the next task for host managed_node1 30529 1726882601.55035: done getting next task for host managed_node1 30529 1726882601.55038: ^ task is: TASK: Show item 30529 1726882601.55042: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882601.55046: getting variables 30529 1726882601.55048: in VariableManager get_vars() 30529 1726882601.55316: Calling all_inventory to load vars for managed_node1 30529 1726882601.55319: Calling groups_inventory to load vars for managed_node1 30529 1726882601.55322: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.55332: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.55334: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.55337: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.56821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.60939: done with get_vars() 30529 1726882601.60961: done getting variables 30529 1726882601.61138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:36:41 -0400 (0:00:00.093) 0:00:15.637 ****** 30529 1726882601.61169: entering _queue_task() for managed_node1/debug 30529 1726882601.61849: worker is 1 (out of 1 available) 30529 1726882601.62091: exiting _queue_task() for managed_node1/debug 30529 1726882601.62105: done queuing things up, now waiting for results queue to drain 30529 1726882601.62106: waiting for pending results... 
30529 1726882601.62735: running TaskExecutor() for managed_node1/TASK: Show item 30529 1726882601.62740: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b5 30529 1726882601.62804: variable 'ansible_search_path' from source: unknown 30529 1726882601.62814: variable 'ansible_search_path' from source: unknown 30529 1726882601.63199: variable 'omit' from source: magic vars 30529 1726882601.63426: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.63430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.63432: variable 'omit' from source: magic vars 30529 1726882601.64224: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.64241: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.64253: variable 'omit' from source: magic vars 30529 1726882601.64499: variable 'omit' from source: magic vars 30529 1726882601.64504: variable 'item' from source: unknown 30529 1726882601.64798: variable 'item' from source: unknown 30529 1726882601.64801: variable 'omit' from source: magic vars 30529 1726882601.64803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882601.64806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.64824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882601.64918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.64939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.65165: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.65169: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882601.65171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.65238: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.65398: Set connection var ansible_pipelining to False 30529 1726882601.65495: Set connection var ansible_shell_type to sh 30529 1726882601.65498: Set connection var ansible_timeout to 10 30529 1726882601.65500: Set connection var ansible_connection to ssh 30529 1726882601.65502: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.65505: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.65506: variable 'ansible_connection' from source: unknown 30529 1726882601.65508: variable 'ansible_module_compression' from source: unknown 30529 1726882601.65510: variable 'ansible_shell_type' from source: unknown 30529 1726882601.65511: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.65513: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.65514: variable 'ansible_pipelining' from source: unknown 30529 1726882601.65516: variable 'ansible_timeout' from source: unknown 30529 1726882601.65518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.65929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.65933: variable 'omit' from source: magic vars 30529 1726882601.65936: starting attempt loop 30529 1726882601.65938: running the handler 30529 1726882601.65941: variable 'lsr_description' from source: include params 30529 1726882601.66072: variable 'lsr_description' from source: include params 30529 1726882601.66157: handler run complete 30529 1726882601.66180: attempt loop 
complete, returning result 30529 1726882601.66206: variable 'item' from source: unknown 30529 1726882601.66385: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 30529 1726882601.66826: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.66829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.66831: variable 'omit' from source: magic vars 30529 1726882601.66975: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.67052: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.67063: variable 'omit' from source: magic vars 30529 1726882601.67081: variable 'omit' from source: magic vars 30529 1726882601.67246: variable 'item' from source: unknown 30529 1726882601.67402: variable 'item' from source: unknown 30529 1726882601.67423: variable 'omit' from source: magic vars 30529 1726882601.67501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.67524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.67549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.67572: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.67802: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.67805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.67807: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.67809: Set connection var ansible_pipelining 
to False 30529 1726882601.67811: Set connection var ansible_shell_type to sh 30529 1726882601.67910: Set connection var ansible_timeout to 10 30529 1726882601.68018: Set connection var ansible_connection to ssh 30529 1726882601.68021: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.68024: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.68026: variable 'ansible_connection' from source: unknown 30529 1726882601.68030: variable 'ansible_module_compression' from source: unknown 30529 1726882601.68031: variable 'ansible_shell_type' from source: unknown 30529 1726882601.68033: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.68034: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.68036: variable 'ansible_pipelining' from source: unknown 30529 1726882601.68038: variable 'ansible_timeout' from source: unknown 30529 1726882601.68039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.68181: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.68248: variable 'omit' from source: magic vars 30529 1726882601.68259: starting attempt loop 30529 1726882601.68298: running the handler 30529 1726882601.68301: variable 'lsr_setup' from source: include params 30529 1726882601.68562: variable 'lsr_setup' from source: include params 30529 1726882601.68574: handler run complete 30529 1726882601.68601: attempt loop complete, returning result 30529 1726882601.68620: variable 'item' from source: unknown 30529 1726882601.68740: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30529 1726882601.69498: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.69502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.69504: variable 'omit' from source: magic vars 30529 1726882601.69558: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.69571: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.69658: variable 'omit' from source: magic vars 30529 1726882601.69678: variable 'omit' from source: magic vars 30529 1726882601.69729: variable 'item' from source: unknown 30529 1726882601.69869: variable 'item' from source: unknown 30529 1726882601.70091: variable 'omit' from source: magic vars 30529 1726882601.70097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.70099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.70101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.70103: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.70105: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.70107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.70233: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.70244: Set connection var ansible_pipelining to False 30529 1726882601.70250: Set connection var ansible_shell_type to sh 30529 1726882601.70264: Set connection var ansible_timeout to 10 30529 1726882601.70270: Set connection var ansible_connection to ssh 30529 1726882601.70279: Set connection var 
ansible_module_compression to ZIP_DEFLATED 30529 1726882601.70324: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.70417: variable 'ansible_connection' from source: unknown 30529 1726882601.70518: variable 'ansible_module_compression' from source: unknown 30529 1726882601.70521: variable 'ansible_shell_type' from source: unknown 30529 1726882601.70523: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.70526: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.70529: variable 'ansible_pipelining' from source: unknown 30529 1726882601.70530: variable 'ansible_timeout' from source: unknown 30529 1726882601.70532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.70653: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.70665: variable 'omit' from source: magic vars 30529 1726882601.70673: starting attempt loop 30529 1726882601.70679: running the handler 30529 1726882601.70707: variable 'lsr_test' from source: include params 30529 1726882601.70975: variable 'lsr_test' from source: include params 30529 1726882601.70978: handler run complete 30529 1726882601.70981: attempt loop complete, returning result 30529 1726882601.70983: variable 'item' from source: unknown 30529 1726882601.71059: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 30529 1726882601.71498: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.71502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882601.71504: variable 'omit' from source: magic vars 30529 1726882601.71846: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.71849: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.71851: variable 'omit' from source: magic vars 30529 1726882601.71854: variable 'omit' from source: magic vars 30529 1726882601.71883: variable 'item' from source: unknown 30529 1726882601.72017: variable 'item' from source: unknown 30529 1726882601.72081: variable 'omit' from source: magic vars 30529 1726882601.72105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.72186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.72201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.72216: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.72224: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.72231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.72501: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.72505: Set connection var ansible_pipelining to False 30529 1726882601.72507: Set connection var ansible_shell_type to sh 30529 1726882601.72509: Set connection var ansible_timeout to 10 30529 1726882601.72511: Set connection var ansible_connection to ssh 30529 1726882601.72513: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.72515: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.72517: variable 'ansible_connection' from source: unknown 30529 1726882601.72519: variable 'ansible_module_compression' from 
source: unknown 30529 1726882601.72521: variable 'ansible_shell_type' from source: unknown 30529 1726882601.72523: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.72525: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.72526: variable 'ansible_pipelining' from source: unknown 30529 1726882601.72528: variable 'ansible_timeout' from source: unknown 30529 1726882601.72530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.72675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.72936: variable 'omit' from source: magic vars 30529 1726882601.72940: starting attempt loop 30529 1726882601.72942: running the handler 30529 1726882601.72944: variable 'lsr_assert' from source: include params 30529 1726882601.72946: variable 'lsr_assert' from source: include params 30529 1726882601.73001: handler run complete 30529 1726882601.73019: attempt loop complete, returning result 30529 1726882601.73039: variable 'item' from source: unknown 30529 1726882601.73105: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 30529 1726882601.73447: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.73450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.73452: variable 'omit' from source: magic vars 30529 1726882601.73899: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.74147: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30529 1726882601.74154: variable 'omit' from source: magic vars 30529 1726882601.74157: variable 'omit' from source: magic vars 30529 1726882601.74159: variable 'item' from source: unknown 30529 1726882601.74304: variable 'item' from source: unknown 30529 1726882601.74307: variable 'omit' from source: magic vars 30529 1726882601.74324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.74335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.74345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.74376: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.74582: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.74585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.74587: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.74589: Set connection var ansible_pipelining to False 30529 1726882601.74591: Set connection var ansible_shell_type to sh 30529 1726882601.74595: Set connection var ansible_timeout to 10 30529 1726882601.74597: Set connection var ansible_connection to ssh 30529 1726882601.74599: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.74711: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.74719: variable 'ansible_connection' from source: unknown 30529 1726882601.74725: variable 'ansible_module_compression' from source: unknown 30529 1726882601.74731: variable 'ansible_shell_type' from source: unknown 30529 1726882601.74737: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.74743: variable 'ansible_host' from source: host vars 
for 'managed_node1' 30529 1726882601.74750: variable 'ansible_pipelining' from source: unknown 30529 1726882601.74756: variable 'ansible_timeout' from source: unknown 30529 1726882601.74763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.75017: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.75021: variable 'omit' from source: magic vars 30529 1726882601.75023: starting attempt loop 30529 1726882601.75025: running the handler 30529 1726882601.75169: handler run complete 30529 1726882601.75248: attempt loop complete, returning result 30529 1726882601.75266: variable 'item' from source: unknown 30529 1726882601.75451: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30529 1726882601.75631: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.75689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.75890: variable 'omit' from source: magic vars 30529 1726882601.76130: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.76434: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.76437: variable 'omit' from source: magic vars 30529 1726882601.76439: variable 'omit' from source: magic vars 30529 1726882601.76442: variable 'item' from source: unknown 30529 1726882601.76757: variable 'item' from source: unknown 30529 1726882601.76761: variable 'omit' from source: magic vars 30529 1726882601.76763: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.76765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.76767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.76770: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.76772: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.76774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.77061: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.77278: Set connection var ansible_pipelining to False 30529 1726882601.77281: Set connection var ansible_shell_type to sh 30529 1726882601.77283: Set connection var ansible_timeout to 10 30529 1726882601.77285: Set connection var ansible_connection to ssh 30529 1726882601.77287: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.77289: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.77291: variable 'ansible_connection' from source: unknown 30529 1726882601.77295: variable 'ansible_module_compression' from source: unknown 30529 1726882601.77297: variable 'ansible_shell_type' from source: unknown 30529 1726882601.77299: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.77301: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.77303: variable 'ansible_pipelining' from source: unknown 30529 1726882601.77305: variable 'ansible_timeout' from source: unknown 30529 1726882601.77306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.77520: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.77610: variable 'omit' from source: magic vars 30529 1726882601.77818: starting attempt loop 30529 1726882601.77821: running the handler 30529 1726882601.77823: variable 'lsr_fail_debug' from source: play vars 30529 1726882601.77846: variable 'lsr_fail_debug' from source: play vars 30529 1726882601.77944: handler run complete 30529 1726882601.77977: attempt loop complete, returning result 30529 1726882601.78126: variable 'item' from source: unknown 30529 1726882601.78288: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30529 1726882601.78525: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.78528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.78531: variable 'omit' from source: magic vars 30529 1726882601.78940: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.78943: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.78945: variable 'omit' from source: magic vars 30529 1726882601.78947: variable 'omit' from source: magic vars 30529 1726882601.79158: variable 'item' from source: unknown 30529 1726882601.79162: variable 'item' from source: unknown 30529 1726882601.79164: variable 'omit' from source: magic vars 30529 1726882601.79166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882601.79377: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.79380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882601.79387: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882601.79389: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.79391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.79394: Set connection var ansible_shell_executable to /bin/sh 30529 1726882601.79702: Set connection var ansible_pipelining to False 30529 1726882601.79705: Set connection var ansible_shell_type to sh 30529 1726882601.79707: Set connection var ansible_timeout to 10 30529 1726882601.79709: Set connection var ansible_connection to ssh 30529 1726882601.79711: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882601.79713: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.79715: variable 'ansible_connection' from source: unknown 30529 1726882601.79718: variable 'ansible_module_compression' from source: unknown 30529 1726882601.79726: variable 'ansible_shell_type' from source: unknown 30529 1726882601.79732: variable 'ansible_shell_executable' from source: unknown 30529 1726882601.79738: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.79746: variable 'ansible_pipelining' from source: unknown 30529 1726882601.79751: variable 'ansible_timeout' from source: unknown 30529 1726882601.79759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.79943: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882601.79953: variable 'omit' from source: magic vars 30529 1726882601.79960: starting attempt loop 30529 1726882601.79965: running the handler 30529 1726882601.79984: variable 'lsr_cleanup' from source: include params 30529 1726882601.80246: variable 'lsr_cleanup' from source: include params 30529 1726882601.80249: handler run complete 30529 1726882601.80251: attempt loop complete, returning result 30529 1726882601.80253: variable 'item' from source: unknown 30529 1726882601.80377: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30529 1726882601.80475: dumping result to json 30529 1726882601.80486: done dumping result, returning 30529 1726882601.80500: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-0000000005b5] 30529 1726882601.80689: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b5 30529 1726882601.80739: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b5 30529 1726882601.80741: WORKER PROCESS EXITING 30529 1726882601.80841: no more pending results, returning what we have 30529 1726882601.80844: results queue empty 30529 1726882601.80845: checking for any_errors_fatal 30529 1726882601.80852: done checking for any_errors_fatal 30529 1726882601.80852: checking for max_fail_percentage 30529 1726882601.80854: done checking for max_fail_percentage 30529 1726882601.80854: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.80855: done checking to see if all hosts have failed 30529 1726882601.80856: getting the remaining hosts for this loop 30529 1726882601.80858: done getting the remaining hosts for this loop 30529 
1726882601.80862: getting the next task for host managed_node1 30529 1726882601.80869: done getting next task for host managed_node1 30529 1726882601.80872: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882601.80874: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882601.80877: getting variables 30529 1726882601.80879: in VariableManager get_vars() 30529 1726882601.81019: Calling all_inventory to load vars for managed_node1 30529 1726882601.81022: Calling groups_inventory to load vars for managed_node1 30529 1726882601.81027: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.81040: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.81043: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.81046: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.84380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.86491: done with get_vars() 30529 1726882601.86524: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:36:41 -0400 (0:00:00.254) 0:00:15.892 ****** 30529 1726882601.86632: entering _queue_task() for managed_node1/include_tasks 30529 
1726882601.87221: worker is 1 (out of 1 available) 30529 1726882601.87230: exiting _queue_task() for managed_node1/include_tasks 30529 1726882601.87240: done queuing things up, now waiting for results queue to drain 30529 1726882601.87241: waiting for pending results... 30529 1726882601.87405: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882601.87470: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b6 30529 1726882601.87490: variable 'ansible_search_path' from source: unknown 30529 1726882601.87498: variable 'ansible_search_path' from source: unknown 30529 1726882601.87534: calling self._execute() 30529 1726882601.87638: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.87650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.87666: variable 'omit' from source: magic vars 30529 1726882601.88167: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.88187: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.88202: _execute() done 30529 1726882601.88211: dumping result to json 30529 1726882601.88221: done dumping result, returning 30529 1726882601.88338: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-0000000005b6] 30529 1726882601.88341: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b6 30529 1726882601.88703: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b6 30529 1726882601.88707: WORKER PROCESS EXITING 30529 1726882601.88739: no more pending results, returning what we have 30529 1726882601.88744: in VariableManager get_vars() 30529 1726882601.88783: Calling all_inventory to load vars for managed_node1 30529 1726882601.88786: Calling groups_inventory to load vars for managed_node1 30529 1726882601.88790: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882601.88833: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.88838: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.88842: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.90243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.91075: done with get_vars() 30529 1726882601.91088: variable 'ansible_search_path' from source: unknown 30529 1726882601.91089: variable 'ansible_search_path' from source: unknown 30529 1726882601.91116: we have included files to process 30529 1726882601.91117: generating all_blocks data 30529 1726882601.91119: done generating all_blocks data 30529 1726882601.91122: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882601.91123: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882601.91124: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882601.91207: in VariableManager get_vars() 30529 1726882601.91220: done with get_vars() 30529 1726882601.91335: done processing included file 30529 1726882601.91337: iterating over new_blocks loaded from include file 30529 1726882601.91339: in VariableManager get_vars() 30529 1726882601.91352: done with get_vars() 30529 1726882601.91353: filtering new block on tags 30529 1726882601.91385: done filtering new block on tags 30529 1726882601.91387: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 1726882601.91392: extending task lists for all hosts with included blocks 30529 1726882601.91816: 
done extending task lists 30529 1726882601.91818: done processing included files 30529 1726882601.91818: results queue empty 30529 1726882601.91819: checking for any_errors_fatal 30529 1726882601.91824: done checking for any_errors_fatal 30529 1726882601.91824: checking for max_fail_percentage 30529 1726882601.91825: done checking for max_fail_percentage 30529 1726882601.91826: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.91827: done checking to see if all hosts have failed 30529 1726882601.91828: getting the remaining hosts for this loop 30529 1726882601.91829: done getting the remaining hosts for this loop 30529 1726882601.91831: getting the next task for host managed_node1 30529 1726882601.91836: done getting next task for host managed_node1 30529 1726882601.91838: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882601.91840: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882601.91843: getting variables 30529 1726882601.91844: in VariableManager get_vars() 30529 1726882601.91852: Calling all_inventory to load vars for managed_node1 30529 1726882601.91853: Calling groups_inventory to load vars for managed_node1 30529 1726882601.91856: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.91861: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.91863: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.91866: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.92631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.93454: done with get_vars() 30529 1726882601.93469: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:41 -0400 (0:00:00.068) 0:00:15.961 ****** 30529 1726882601.93522: entering _queue_task() for managed_node1/include_tasks 30529 1726882601.93768: worker is 1 (out of 1 available) 30529 1726882601.93781: exiting _queue_task() for managed_node1/include_tasks 30529 1726882601.93796: done queuing things up, now waiting for results queue to drain 30529 1726882601.93798: waiting for pending results... 
30529 1726882601.93977: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882601.94061: in run() - task 12673a56-9f93-b0f1-edc0-0000000005dd 30529 1726882601.94075: variable 'ansible_search_path' from source: unknown 30529 1726882601.94079: variable 'ansible_search_path' from source: unknown 30529 1726882601.94116: calling self._execute() 30529 1726882601.94210: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882601.94214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882601.94397: variable 'omit' from source: magic vars 30529 1726882601.94597: variable 'ansible_distribution_major_version' from source: facts 30529 1726882601.94609: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882601.94615: _execute() done 30529 1726882601.94619: dumping result to json 30529 1726882601.94632: done dumping result, returning 30529 1726882601.94639: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-0000000005dd] 30529 1726882601.94644: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005dd 30529 1726882601.94736: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005dd 30529 1726882601.94740: WORKER PROCESS EXITING 30529 1726882601.94768: no more pending results, returning what we have 30529 1726882601.94774: in VariableManager get_vars() 30529 1726882601.94812: Calling all_inventory to load vars for managed_node1 30529 1726882601.94816: Calling groups_inventory to load vars for managed_node1 30529 1726882601.94819: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.94834: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.94843: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.94847: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882601.96519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882601.98049: done with get_vars() 30529 1726882601.98067: variable 'ansible_search_path' from source: unknown 30529 1726882601.98068: variable 'ansible_search_path' from source: unknown 30529 1726882601.98113: we have included files to process 30529 1726882601.98114: generating all_blocks data 30529 1726882601.98116: done generating all_blocks data 30529 1726882601.98117: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882601.98118: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882601.98120: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882601.98386: done processing included file 30529 1726882601.98387: iterating over new_blocks loaded from include file 30529 1726882601.98389: in VariableManager get_vars() 30529 1726882601.98403: done with get_vars() 30529 1726882601.98404: filtering new block on tags 30529 1726882601.98443: done filtering new block on tags 30529 1726882601.98445: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882601.98450: extending task lists for all hosts with included blocks 30529 1726882601.98602: done extending task lists 30529 1726882601.98603: done processing included files 30529 1726882601.98604: results queue empty 30529 1726882601.98605: checking for any_errors_fatal 30529 1726882601.98608: done checking for any_errors_fatal 30529 1726882601.98608: checking for max_fail_percentage 30529 1726882601.98609: done 
checking for max_fail_percentage 30529 1726882601.98610: checking to see if all hosts have failed and the running result is not ok 30529 1726882601.98611: done checking to see if all hosts have failed 30529 1726882601.98612: getting the remaining hosts for this loop 30529 1726882601.98613: done getting the remaining hosts for this loop 30529 1726882601.98616: getting the next task for host managed_node1 30529 1726882601.98620: done getting next task for host managed_node1 30529 1726882601.98622: ^ task is: TASK: Gather current interface info 30529 1726882601.98625: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882601.98627: getting variables 30529 1726882601.98635: in VariableManager get_vars() 30529 1726882601.98645: Calling all_inventory to load vars for managed_node1 30529 1726882601.98647: Calling groups_inventory to load vars for managed_node1 30529 1726882601.98649: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882601.98654: Calling all_plugins_play to load vars for managed_node1 30529 1726882601.98657: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882601.98659: Calling groups_plugins_play to load vars for managed_node1 30529 1726882601.99795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.01488: done with get_vars() 30529 1726882602.01516: done getting variables 30529 1726882602.01573: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:42 -0400 (0:00:00.080) 0:00:16.042 ****** 30529 1726882602.01609: entering _queue_task() for managed_node1/command 30529 1726882602.01988: worker is 1 (out of 1 available) 30529 1726882602.02006: exiting _queue_task() for managed_node1/command 30529 1726882602.02021: done queuing things up, now waiting for results queue to drain 30529 1726882602.02023: waiting for pending results... 
30529 1726882602.02312: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882602.02446: in run() - task 12673a56-9f93-b0f1-edc0-000000000618 30529 1726882602.02460: variable 'ansible_search_path' from source: unknown 30529 1726882602.02465: variable 'ansible_search_path' from source: unknown 30529 1726882602.02498: calling self._execute() 30529 1726882602.02799: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.02803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.02807: variable 'omit' from source: magic vars 30529 1726882602.02976: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.02991: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.02996: variable 'omit' from source: magic vars 30529 1726882602.03048: variable 'omit' from source: magic vars 30529 1726882602.03092: variable 'omit' from source: magic vars 30529 1726882602.03130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882602.03166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882602.03194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882602.03211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.03225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.03255: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882602.03259: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.03262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882602.03371: Set connection var ansible_shell_executable to /bin/sh 30529 1726882602.03374: Set connection var ansible_pipelining to False 30529 1726882602.03377: Set connection var ansible_shell_type to sh 30529 1726882602.03390: Set connection var ansible_timeout to 10 30529 1726882602.03394: Set connection var ansible_connection to ssh 30529 1726882602.03404: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882602.03428: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.03432: variable 'ansible_connection' from source: unknown 30529 1726882602.03435: variable 'ansible_module_compression' from source: unknown 30529 1726882602.03437: variable 'ansible_shell_type' from source: unknown 30529 1726882602.03439: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.03442: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.03447: variable 'ansible_pipelining' from source: unknown 30529 1726882602.03449: variable 'ansible_timeout' from source: unknown 30529 1726882602.03453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.03588: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882602.03595: variable 'omit' from source: magic vars 30529 1726882602.03601: starting attempt loop 30529 1726882602.03604: running the handler 30529 1726882602.03629: _low_level_execute_command(): starting 30529 1726882602.03637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882602.04405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882602.04509: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.04522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.04575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.04578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.04631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.06318: stdout chunk (state=3): >>>/root <<< 30529 1726882602.06420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.06503: stderr chunk (state=3): >>><<< 30529 1726882602.06506: stdout chunk (state=3): >>><<< 30529 1726882602.06524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.06623: _low_level_execute_command(): starting 30529 1726882602.06628: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481 `" && echo ansible-tmp-1726882602.0653079-31269-182264781250481="` echo /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481 `" ) && sleep 0' 30529 1726882602.07218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882602.07239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882602.07353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.07380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.07461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.09312: stdout chunk (state=3): >>>ansible-tmp-1726882602.0653079-31269-182264781250481=/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481 <<< 30529 1726882602.09429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.09445: stderr chunk (state=3): >>><<< 30529 1726882602.09448: stdout chunk (state=3): >>><<< 30529 1726882602.09465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882602.0653079-31269-182264781250481=/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.09494: variable 'ansible_module_compression' from source: unknown 30529 1726882602.09536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882602.09567: variable 'ansible_facts' from source: unknown 30529 1726882602.09624: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py 30529 1726882602.09725: Sending initial data 30529 1726882602.09729: Sent initial data (156 bytes) 30529 1726882602.10174: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.10192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882602.10220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882602.10222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.10225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.10228: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.10305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.10349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.11851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882602.11860: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882602.11889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882602.11927: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0vb897cr /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py <<< 30529 1726882602.11936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py" <<< 30529 1726882602.11966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0vb897cr" to remote "/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py" <<< 30529 1726882602.12512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.12522: stderr chunk (state=3): >>><<< 30529 1726882602.12525: stdout chunk (state=3): >>><<< 30529 1726882602.12562: done transferring module to remote 30529 1726882602.12574: _low_level_execute_command(): starting 30529 1726882602.12577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/ /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py && sleep 0' 30529 1726882602.12989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.12995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882602.13031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882602.13034: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882602.13036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882602.13038: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.13073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.13090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.13130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.14891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.14897: stdout chunk (state=3): >>><<< 30529 1726882602.14900: stderr chunk (state=3): >>><<< 30529 1726882602.14996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.15001: _low_level_execute_command(): starting 30529 1726882602.15003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/AnsiballZ_command.py && sleep 0' 30529 1726882602.15547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882602.15561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882602.15577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.15609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.15664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.15727: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.15774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.15824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.31087: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:42.305177", "end": "2024-09-20 21:36:42.308471", "delta": "0:00:00.003294", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882602.32510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.32514: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882602.32526: stderr chunk (state=3): >>><<< 30529 1726882602.32529: stdout chunk (state=3): >>><<< 30529 1726882602.32555: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:42.305177", "end": "2024-09-20 21:36:42.308471", "delta": "0:00:00.003294", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
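The module_args logged in the result above fully describe the command that ran on the target. As a hedged reconstruction (the task name matches the TASK banner later in the log, but the `register` name is only inferred from the `'_current_interfaces'` variable references that follow; the verbatim playbook source is not shown here), the originating task is roughly:

```yaml
# Sketch reconstructed from the logged module_args; not the verbatim playbook.
- name: Gather current interface info
  command:
    chdir: /sys/class/net        # logged as "chdir": "/sys/class/net"
    cmd: ls -1                   # logged as "_raw_params": "ls -1"
  register: _current_interfaces  # inferred from later variable lookups
```

This matches the logged stdout `bonding_masters\neth0\nlo`, i.e. the entries of /sys/class/net on the managed node.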
30529 1726882602.32594: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882602.32707: _low_level_execute_command(): starting 30529 1726882602.32710: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882602.0653079-31269-182264781250481/ > /dev/null 2>&1 && sleep 0' 30529 1726882602.34260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.34263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.34270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.35953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.35999: stderr chunk (state=3): >>><<< 30529 1726882602.36005: stdout chunk (state=3): >>><<< 30529 1726882602.36030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.36035: handler run complete 30529 1726882602.36060: Evaluated conditional (False): False 30529 1726882602.36071: attempt loop complete, returning 
result 30529 1726882602.36074: _execute() done 30529 1726882602.36076: dumping result to json 30529 1726882602.36088: done dumping result, returning 30529 1726882602.36091: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-000000000618] 30529 1726882602.36096: sending task result for task 12673a56-9f93-b0f1-edc0-000000000618 30529 1726882602.36207: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000618 30529 1726882602.36211: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003294", "end": "2024-09-20 21:36:42.308471", "rc": 0, "start": "2024-09-20 21:36:42.305177" } STDOUT: bonding_masters eth0 lo 30529 1726882602.36283: no more pending results, returning what we have 30529 1726882602.36288: results queue empty 30529 1726882602.36289: checking for any_errors_fatal 30529 1726882602.36291: done checking for any_errors_fatal 30529 1726882602.36291: checking for max_fail_percentage 30529 1726882602.36495: done checking for max_fail_percentage 30529 1726882602.36497: checking to see if all hosts have failed and the running result is not ok 30529 1726882602.36498: done checking to see if all hosts have failed 30529 1726882602.36499: getting the remaining hosts for this loop 30529 1726882602.36500: done getting the remaining hosts for this loop 30529 1726882602.36504: getting the next task for host managed_node1 30529 1726882602.36511: done getting next task for host managed_node1 30529 1726882602.36513: ^ task is: TASK: Set current_interfaces 30529 1726882602.36517: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882602.36521: getting variables 30529 1726882602.36522: in VariableManager get_vars() 30529 1726882602.36548: Calling all_inventory to load vars for managed_node1 30529 1726882602.36550: Calling groups_inventory to load vars for managed_node1 30529 1726882602.36554: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.36563: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.36566: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.36568: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.37912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.39517: done with get_vars() 30529 1726882602.39542: done getting variables 30529 1726882602.39626: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** 
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:42 -0400 (0:00:00.380) 0:00:16.422 ****** 30529 1726882602.39662: entering _queue_task() for managed_node1/set_fact 30529 1726882602.40065: worker is 1 (out of 1 available) 30529 1726882602.40078: exiting _queue_task() for managed_node1/set_fact 30529 1726882602.40097: done queuing things up, now waiting for results queue to drain 30529 1726882602.40099: waiting for pending results... 30529 1726882602.40912: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882602.40918: in run() - task 12673a56-9f93-b0f1-edc0-000000000619 30529 1726882602.40936: variable 'ansible_search_path' from source: unknown 30529 1726882602.41033: variable 'ansible_search_path' from source: unknown 30529 1726882602.41154: calling self._execute() 30529 1726882602.41158: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.41258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.41311: variable 'omit' from source: magic vars 30529 1726882602.41982: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.42006: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.42022: variable 'omit' from source: magic vars 30529 1726882602.42076: variable 'omit' from source: magic vars 30529 1726882602.42199: variable '_current_interfaces' from source: set_fact 30529 1726882602.42271: variable 'omit' from source: magic vars 30529 1726882602.42370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882602.42479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882602.42483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30529 1726882602.42488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.42503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.42537: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882602.42546: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.42553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.42649: Set connection var ansible_shell_executable to /bin/sh 30529 1726882602.42659: Set connection var ansible_pipelining to False 30529 1726882602.42666: Set connection var ansible_shell_type to sh 30529 1726882602.42895: Set connection var ansible_timeout to 10 30529 1726882602.42899: Set connection var ansible_connection to ssh 30529 1726882602.42901: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882602.42903: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.42906: variable 'ansible_connection' from source: unknown 30529 1726882602.42908: variable 'ansible_module_compression' from source: unknown 30529 1726882602.42910: variable 'ansible_shell_type' from source: unknown 30529 1726882602.42911: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.42913: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.42915: variable 'ansible_pipelining' from source: unknown 30529 1726882602.42917: variable 'ansible_timeout' from source: unknown 30529 1726882602.42919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.43054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882602.43070: variable 'omit' from source: magic vars 30529 1726882602.43080: starting attempt loop 30529 1726882602.43090: running the handler 30529 1726882602.43113: handler run complete 30529 1726882602.43128: attempt loop complete, returning result 30529 1726882602.43135: _execute() done 30529 1726882602.43141: dumping result to json 30529 1726882602.43148: done dumping result, returning 30529 1726882602.43218: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-000000000619] 30529 1726882602.43221: sending task result for task 12673a56-9f93-b0f1-edc0-000000000619 30529 1726882602.43420: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000619 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30529 1726882602.43561: no more pending results, returning what we have 30529 1726882602.43564: results queue empty 30529 1726882602.43566: checking for any_errors_fatal 30529 1726882602.43577: done checking for any_errors_fatal 30529 1726882602.43578: checking for max_fail_percentage 30529 1726882602.43580: done checking for max_fail_percentage 30529 1726882602.43581: checking to see if all hosts have failed and the running result is not ok 30529 1726882602.43582: done checking to see if all hosts have failed 30529 1726882602.43583: getting the remaining hosts for this loop 30529 1726882602.43585: done getting the remaining hosts for this loop 30529 1726882602.43591: getting the next task for host managed_node1 30529 1726882602.43605: done getting next task for host managed_node1 30529 1726882602.43610: ^ task is: TASK: Show current_interfaces 30529 1726882602.43615: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882602.43620: getting variables 30529 1726882602.43622: in VariableManager get_vars() 30529 1726882602.43661: Calling all_inventory to load vars for managed_node1 30529 1726882602.43664: Calling groups_inventory to load vars for managed_node1 30529 1726882602.43668: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.43685: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.43691: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.43698: WORKER PROCESS EXITING 30529 1726882602.44200: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.45854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.47039: done with get_vars() 30529 1726882602.47056: done getting variables 30529 1726882602.47104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] 
************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:42 -0400 (0:00:00.074) 0:00:16.497 ****** 30529 1726882602.47127: entering _queue_task() for managed_node1/debug 30529 1726882602.47383: worker is 1 (out of 1 available) 30529 1726882602.47406: exiting _queue_task() for managed_node1/debug 30529 1726882602.47418: done queuing things up, now waiting for results queue to drain 30529 1726882602.47420: waiting for pending results... 30529 1726882602.47679: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 30529 1726882602.47787: in run() - task 12673a56-9f93-b0f1-edc0-0000000005de 30529 1726882602.47803: variable 'ansible_search_path' from source: unknown 30529 1726882602.47807: variable 'ansible_search_path' from source: unknown 30529 1726882602.47850: calling self._execute() 30529 1726882602.47938: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.47942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.47953: variable 'omit' from source: magic vars 30529 1726882602.48418: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.48422: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.48425: variable 'omit' from source: magic vars 30529 1726882602.48526: variable 'omit' from source: magic vars 30529 1726882602.48640: variable 'current_interfaces' from source: set_fact 30529 1726882602.48644: variable 'omit' from source: magic vars 30529 1726882602.48646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882602.48677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882602.48909: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882602.48913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.48916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.48918: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882602.48921: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.48923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.49077: Set connection var ansible_shell_executable to /bin/sh 30529 1726882602.49081: Set connection var ansible_pipelining to False 30529 1726882602.49084: Set connection var ansible_shell_type to sh 30529 1726882602.49097: Set connection var ansible_timeout to 10 30529 1726882602.49209: Set connection var ansible_connection to ssh 30529 1726882602.49223: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882602.49245: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.49249: variable 'ansible_connection' from source: unknown 30529 1726882602.49251: variable 'ansible_module_compression' from source: unknown 30529 1726882602.49254: variable 'ansible_shell_type' from source: unknown 30529 1726882602.49256: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.49259: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.49261: variable 'ansible_pipelining' from source: unknown 30529 1726882602.49263: variable 'ansible_timeout' from source: unknown 30529 1726882602.49268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.49679: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882602.49691: variable 'omit' from source: magic vars 30529 1726882602.49700: starting attempt loop 30529 1726882602.49703: running the handler 30529 1726882602.49774: handler run complete 30529 1726882602.49804: attempt loop complete, returning result 30529 1726882602.49807: _execute() done 30529 1726882602.49810: dumping result to json 30529 1726882602.49812: done dumping result, returning 30529 1726882602.49883: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-0000000005de] 30529 1726882602.49888: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005de 30529 1726882602.49949: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005de 30529 1726882602.49952: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30529 1726882602.50038: no more pending results, returning what we have 30529 1726882602.50044: results queue empty 30529 1726882602.50045: checking for any_errors_fatal 30529 1726882602.50055: done checking for any_errors_fatal 30529 1726882602.50056: checking for max_fail_percentage 30529 1726882602.50058: done checking for max_fail_percentage 30529 1726882602.50059: checking to see if all hosts have failed and the running result is not ok 30529 1726882602.50060: done checking to see if all hosts have failed 30529 1726882602.50061: getting the remaining hosts for this loop 30529 1726882602.50063: done getting the remaining hosts for this loop 30529 1726882602.50071: getting the next task for host managed_node1 30529 1726882602.50081: done getting next task for host managed_node1 30529 1726882602.50085: ^ task is: TASK: Setup 30529 1726882602.50090: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882602.50096: getting variables 30529 1726882602.50098: in VariableManager get_vars() 30529 1726882602.50126: Calling all_inventory to load vars for managed_node1 30529 1726882602.50128: Calling groups_inventory to load vars for managed_node1 30529 1726882602.50132: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.50142: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.50145: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.50147: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.51599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.53537: done with get_vars() 30529 1726882602.53563: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:36:42 -0400 (0:00:00.065) 0:00:16.562 ****** 30529 1726882602.53669: entering _queue_task() for managed_node1/include_tasks 30529 1726882602.54033: worker is 1 (out of 1 available) 30529 1726882602.54046: exiting _queue_task() for managed_node1/include_tasks 30529 1726882602.54058: done queuing things up, now waiting for results queue to drain 30529 1726882602.54059: waiting for pending results... 
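The two task results above (Set current_interfaces and Show current_interfaces) map onto tasks of roughly this shape — a minimal sketch assuming the fact is built from `stdout_lines` of the registered command result; the exact expressions in get_current_interfaces.yml and show_interfaces.yml are not shown in the log:

```yaml
# Sketch consistent with the logged results; the Jinja2 expressions are assumptions.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

Both are consistent with the logged output: `ansible_facts` gains `current_interfaces: ["bonding_masters", "eth0", "lo"]`, and the debug task prints `current_interfaces: ['bonding_masters', 'eth0', 'lo']`.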
30529 1726882602.54409: running TaskExecutor() for managed_node1/TASK: Setup 30529 1726882602.54601: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b7 30529 1726882602.54606: variable 'ansible_search_path' from source: unknown 30529 1726882602.54610: variable 'ansible_search_path' from source: unknown 30529 1726882602.54628: variable 'lsr_setup' from source: include params 30529 1726882602.54844: variable 'lsr_setup' from source: include params 30529 1726882602.54925: variable 'omit' from source: magic vars 30529 1726882602.55059: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.55068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.55088: variable 'omit' from source: magic vars 30529 1726882602.55401: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.55405: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.55407: variable 'item' from source: unknown 30529 1726882602.55410: variable 'item' from source: unknown 30529 1726882602.55434: variable 'item' from source: unknown 30529 1726882602.55488: variable 'item' from source: unknown 30529 1726882602.55698: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.55702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.55704: variable 'omit' from source: magic vars 30529 1726882602.55898: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.55902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.55905: variable 'item' from source: unknown 30529 1726882602.55907: variable 'item' from source: unknown 30529 1726882602.55909: variable 'item' from source: unknown 30529 1726882602.55923: variable 'item' from source: unknown 30529 1726882602.55987: dumping result to json 30529 1726882602.56030: done dumping result, returning 30529 
1726882602.56034: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-0000000005b7] 30529 1726882602.56036: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b7 30529 1726882602.56092: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b7 30529 1726882602.56097: WORKER PROCESS EXITING 30529 1726882602.56174: no more pending results, returning what we have 30529 1726882602.56179: in VariableManager get_vars() 30529 1726882602.56220: Calling all_inventory to load vars for managed_node1 30529 1726882602.56223: Calling groups_inventory to load vars for managed_node1 30529 1726882602.56227: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.56241: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.56244: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.56247: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.59015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.60850: done with get_vars() 30529 1726882602.60871: variable 'ansible_search_path' from source: unknown 30529 1726882602.60872: variable 'ansible_search_path' from source: unknown 30529 1726882602.60927: variable 'ansible_search_path' from source: unknown 30529 1726882602.60928: variable 'ansible_search_path' from source: unknown 30529 1726882602.60959: we have included files to process 30529 1726882602.60960: generating all_blocks data 30529 1726882602.60962: done generating all_blocks data 30529 1726882602.60966: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882602.60967: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882602.60969: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30529 1726882602.61169: done processing included file 30529 1726882602.61171: iterating over new_blocks loaded from include file 30529 1726882602.61172: in VariableManager get_vars() 30529 1726882602.61189: done with get_vars() 30529 1726882602.61191: filtering new block on tags 30529 1726882602.61228: done filtering new block on tags 30529 1726882602.61231: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 => (item=tasks/delete_interface.yml) 30529 1726882602.61237: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882602.61238: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882602.61241: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882602.61335: in VariableManager get_vars() 30529 1726882602.61353: done with get_vars() 30529 1726882602.61448: done processing included file 30529 1726882602.61450: iterating over new_blocks loaded from include file 30529 1726882602.61452: in VariableManager get_vars() 30529 1726882602.61464: done with get_vars() 30529 1726882602.61465: filtering new block on tags 30529 1726882602.61665: done filtering new block on tags 30529 1726882602.61668: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 => (item=tasks/assert_device_absent.yml) 30529 1726882602.61671: extending task lists for all hosts with 
included blocks 30529 1726882602.62351: done extending task lists 30529 1726882602.62352: done processing included files 30529 1726882602.62353: results queue empty 30529 1726882602.62354: checking for any_errors_fatal 30529 1726882602.62357: done checking for any_errors_fatal 30529 1726882602.62357: checking for max_fail_percentage 30529 1726882602.62359: done checking for max_fail_percentage 30529 1726882602.62359: checking to see if all hosts have failed and the running result is not ok 30529 1726882602.62360: done checking to see if all hosts have failed 30529 1726882602.62361: getting the remaining hosts for this loop 30529 1726882602.62362: done getting the remaining hosts for this loop 30529 1726882602.62365: getting the next task for host managed_node1 30529 1726882602.62369: done getting next task for host managed_node1 30529 1726882602.62371: ^ task is: TASK: Remove test interface if necessary 30529 1726882602.62373: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882602.62376: getting variables 30529 1726882602.62377: in VariableManager get_vars() 30529 1726882602.62392: Calling all_inventory to load vars for managed_node1 30529 1726882602.62396: Calling groups_inventory to load vars for managed_node1 30529 1726882602.62398: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.62403: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.62405: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.62408: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.63832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882602.65395: done with get_vars() 30529 1726882602.65410: done getting variables 30529 1726882602.65440: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interface if necessary] **************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
Friday 20 September 2024  21:36:42 -0400 (0:00:00.117)       0:00:16.680 ******

30529 1726882602.65460: entering _queue_task() for managed_node1/command 30529 1726882602.65722: worker is 1 (out of 1 available) 30529 1726882602.65736: exiting _queue_task() for managed_node1/command 30529 1726882602.65751: done queuing things up, now waiting for results queue to drain 30529 1726882602.65753: waiting for pending results...
30529 1726882602.65931: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 30529 1726882602.66010: in run() - task 12673a56-9f93-b0f1-edc0-00000000063e 30529 1726882602.66024: variable 'ansible_search_path' from source: unknown 30529 1726882602.66028: variable 'ansible_search_path' from source: unknown 30529 1726882602.66055: calling self._execute() 30529 1726882602.66122: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.66127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.66136: variable 'omit' from source: magic vars 30529 1726882602.66402: variable 'ansible_distribution_major_version' from source: facts 30529 1726882602.66414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882602.66418: variable 'omit' from source: magic vars 30529 1726882602.66449: variable 'omit' from source: magic vars 30529 1726882602.66520: variable 'interface' from source: play vars 30529 1726882602.66533: variable 'omit' from source: magic vars 30529 1726882602.66566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882602.66598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882602.66613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882602.66627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.66639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882602.66661: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882602.66664: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.66666: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.66741: Set connection var ansible_shell_executable to /bin/sh 30529 1726882602.66745: Set connection var ansible_pipelining to False 30529 1726882602.66747: Set connection var ansible_shell_type to sh 30529 1726882602.66755: Set connection var ansible_timeout to 10 30529 1726882602.66758: Set connection var ansible_connection to ssh 30529 1726882602.66762: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882602.66778: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.66781: variable 'ansible_connection' from source: unknown 30529 1726882602.66784: variable 'ansible_module_compression' from source: unknown 30529 1726882602.66788: variable 'ansible_shell_type' from source: unknown 30529 1726882602.66790: variable 'ansible_shell_executable' from source: unknown 30529 1726882602.66799: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882602.66801: variable 'ansible_pipelining' from source: unknown 30529 1726882602.66804: variable 'ansible_timeout' from source: unknown 30529 1726882602.66806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882602.66907: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882602.66914: variable 'omit' from source: magic vars 30529 1726882602.66919: starting attempt loop 30529 1726882602.66921: running the handler 30529 1726882602.66944: _low_level_execute_command(): starting 30529 1726882602.66965: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882602.67719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.67743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.67792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.69392: stdout chunk (state=3): >>>/root <<< 30529 1726882602.69491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.69521: stderr chunk (state=3): >>><<< 30529 1726882602.69525: stdout chunk (state=3): >>><<< 30529 1726882602.69544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.69555: _low_level_execute_command(): starting 30529 1726882602.69561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794 `" && echo ansible-tmp-1726882602.6954458-31320-102032033626794="` echo /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794 `" ) && sleep 0' 30529 1726882602.69976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882602.70015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882602.70019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882602.70023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.70026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882602.70035: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.70078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.70082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.70129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.71975: stdout chunk (state=3): >>>ansible-tmp-1726882602.6954458-31320-102032033626794=/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794 <<< 30529 1726882602.72074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.72107: stderr chunk (state=3): >>><<< 30529 1726882602.72110: stdout chunk (state=3): >>><<< 30529 1726882602.72126: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882602.6954458-31320-102032033626794=/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.72152: variable 'ansible_module_compression' from source: unknown 30529 1726882602.72191: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882602.72228: variable 'ansible_facts' from source: unknown 30529 1726882602.72281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py 30529 1726882602.72384: Sending initial data 30529 1726882602.72388: Sent initial data (156 bytes) 30529 1726882602.72813: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.72817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.72830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.72882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.72888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.72933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.74433: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882602.74440: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882602.74472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882602.74521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpir5m4w4s /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py <<< 30529 1726882602.74529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py" <<< 30529 1726882602.74562: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpir5m4w4s" to remote "/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py" <<< 30529 1726882602.75079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.75124: stderr chunk (state=3): >>><<< 30529 1726882602.75127: stdout chunk (state=3): >>><<< 30529 1726882602.75139: done transferring module to remote 30529 1726882602.75148: _low_level_execute_command(): starting 30529 1726882602.75153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/ /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py && sleep 0' 30529 1726882602.75573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.75576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.75579: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882602.75581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.75628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.75631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.75681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.77372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.77400: stderr chunk (state=3): >>><<< 30529 1726882602.77403: stdout chunk (state=3): >>><<< 30529 1726882602.77413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.77415: _low_level_execute_command(): starting 30529 1726882602.77420: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/AnsiballZ_command.py && sleep 0' 30529 1726882602.77813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.77817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.77838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.77875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.77887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 30529 1726882602.77942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.93502: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:36:42.927985", "end": "2024-09-20 21:36:42.933786", "delta": "0:00:00.005801", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882602.94831: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. <<< 30529 1726882602.94835: stdout chunk (state=3): >>><<< 30529 1726882602.94838: stderr chunk (state=3): >>><<< 30529 1726882602.94866: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:36:42.927985", "end": "2024-09-20 21:36:42.933786", "delta": "0:00:00.005801", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 30529 1726882602.94906: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882602.94910: _low_level_execute_command(): starting 30529 1726882602.94924: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882602.6954458-31320-102032033626794/ > /dev/null 2>&1 && sleep 0' 30529 1726882602.95571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30529 1726882602.95584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882602.95591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882602.95698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882602.95703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882602.95724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882602.95735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882602.95754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882602.95829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882602.97607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882602.97627: stderr chunk (state=3): >>><<< 30529 1726882602.97634: stdout chunk (state=3): >>><<< 30529 1726882602.97652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882602.97660: handler run complete 30529 1726882602.97678: Evaluated conditional (False): False 30529 1726882602.97687: attempt loop complete, returning result 30529 1726882602.97692: _execute() done 30529 1726882602.97698: dumping result to json 30529 1726882602.97704: done dumping result, returning 30529 1726882602.97711: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [12673a56-9f93-b0f1-edc0-00000000063e] 30529 1726882602.97717: sending task result for task 12673a56-9f93-b0f1-edc0-00000000063e 30529 1726882602.97812: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000063e 30529 1726882602.97815: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "statebr"
    ],
    "delta": "0:00:00.005801",
    "end": "2024-09-20 21:36:42.933786",
    "rc": 1,
    "start": "2024-09-20 21:36:42.927985"
}

STDERR:

Cannot find device "statebr"

MSG:

non-zero return code
...ignoring
30529 1726882602.97922: no more pending results, returning what we have 30529 1726882602.97927: results queue empty 30529 1726882602.97928: checking for any_errors_fatal 30529 1726882602.97930: done checking for any_errors_fatal 30529 1726882602.97930: checking for max_fail_percentage 30529 1726882602.97932: done checking for max_fail_percentage 30529 1726882602.97932: checking to see if all hosts have failed and the running result is not ok 30529 1726882602.97933: done checking to see if all hosts have failed 30529 1726882602.97934: getting the remaining hosts for this loop 30529 1726882602.97935: done getting the remaining hosts for this loop 30529 1726882602.97939: getting the next task for host managed_node1 30529 1726882602.97948: done getting next task for host managed_node1 30529 1726882602.97951: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882602.97954: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
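The ignored failure above is expected cleanup behavior: the task tries to delete the test interface before recreating it, and `rc=1` ("Cannot find device") simply means the interface was already absent. Based on the module arguments captured in the log (`_raw_params: "ip link del statebr"`, `_uses_shell: false`) and the `...ignoring` marker, the task in `tasks/delete_interface.yml` plausibly looks like the following sketch; the actual file contents are not shown in the log, and the `interface` variable name is inferred from the logged play vars:

```yaml
# Hypothetical reconstruction of tasks/delete_interface.yml, not the verbatim file.
# The log shows the rendered command "ip link del statebr" executed via the
# command module (_uses_shell: false) and the non-zero rc being ignored.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  ignore_errors: true  # "Cannot find device" (rc=1) is fine when the interface is already gone
```

An alternative to `ignore_errors` would be `failed_when: false` or a `failed_when` expression that tolerates only the "Cannot find device" stderr, which avoids masking unrelated failures.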
False, did start at task? False 30529 1726882602.97958: getting variables 30529 1726882602.97959: in VariableManager get_vars() 30529 1726882602.97987: Calling all_inventory to load vars for managed_node1 30529 1726882602.97990: Calling groups_inventory to load vars for managed_node1 30529 1726882602.97995: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882602.98005: Calling all_plugins_play to load vars for managed_node1 30529 1726882602.98007: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882602.98009: Calling groups_plugins_play to load vars for managed_node1 30529 1726882602.98978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.01344: done with get_vars() 30529 1726882603.01376: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:43 -0400 (0:00:00.360) 0:00:17.040 ****** 30529 1726882603.01482: entering _queue_task() for managed_node1/include_tasks 30529 1726882603.01848: worker is 1 (out of 1 available) 30529 1726882603.01861: exiting _queue_task() for managed_node1/include_tasks 30529 1726882603.01873: done queuing things up, now waiting for results queue to drain 30529 1726882603.01875: waiting for pending results... 
30529 1726882603.02140: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882603.02218: in run() - task 12673a56-9f93-b0f1-edc0-000000000642 30529 1726882603.02244: variable 'ansible_search_path' from source: unknown 30529 1726882603.02251: variable 'ansible_search_path' from source: unknown 30529 1726882603.02300: calling self._execute() 30529 1726882603.02407: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.02444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.02478: variable 'omit' from source: magic vars 30529 1726882603.02888: variable 'ansible_distribution_major_version' from source: facts 30529 1726882603.02891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882603.02898: _execute() done 30529 1726882603.02900: dumping result to json 30529 1726882603.02903: done dumping result, returning 30529 1726882603.02905: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-000000000642] 30529 1726882603.02911: sending task result for task 12673a56-9f93-b0f1-edc0-000000000642 30529 1726882603.03046: no more pending results, returning what we have 30529 1726882603.03051: in VariableManager get_vars() 30529 1726882603.03084: Calling all_inventory to load vars for managed_node1 30529 1726882603.03088: Calling groups_inventory to load vars for managed_node1 30529 1726882603.03092: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.03107: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.03111: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.03113: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.03844: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000642 30529 1726882603.03847: WORKER PROCESS EXITING 30529 
1726882603.10481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.12213: done with get_vars() 30529 1726882603.12235: variable 'ansible_search_path' from source: unknown 30529 1726882603.12237: variable 'ansible_search_path' from source: unknown 30529 1726882603.12246: variable 'item' from source: include params 30529 1726882603.12333: variable 'item' from source: include params 30529 1726882603.12364: we have included files to process 30529 1726882603.12366: generating all_blocks data 30529 1726882603.12367: done generating all_blocks data 30529 1726882603.12370: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882603.12371: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882603.12373: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882603.12540: done processing included file 30529 1726882603.12542: iterating over new_blocks loaded from include file 30529 1726882603.12543: in VariableManager get_vars() 30529 1726882603.12558: done with get_vars() 30529 1726882603.12560: filtering new block on tags 30529 1726882603.12585: done filtering new block on tags 30529 1726882603.12590: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882603.12596: extending task lists for all hosts with included blocks 30529 1726882603.12758: done extending task lists 30529 1726882603.12759: done processing included files 30529 1726882603.12760: results queue empty 30529 1726882603.12761: checking for any_errors_fatal 30529 1726882603.12764: done 
checking for any_errors_fatal 30529 1726882603.12765: checking for max_fail_percentage 30529 1726882603.12766: done checking for max_fail_percentage 30529 1726882603.12767: checking to see if all hosts have failed and the running result is not ok 30529 1726882603.12767: done checking to see if all hosts have failed 30529 1726882603.12768: getting the remaining hosts for this loop 30529 1726882603.12769: done getting the remaining hosts for this loop 30529 1726882603.12771: getting the next task for host managed_node1 30529 1726882603.12775: done getting next task for host managed_node1 30529 1726882603.12777: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882603.12780: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882603.12782: getting variables 30529 1726882603.12783: in VariableManager get_vars() 30529 1726882603.12795: Calling all_inventory to load vars for managed_node1 30529 1726882603.12798: Calling groups_inventory to load vars for managed_node1 30529 1726882603.12800: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.12805: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.12808: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.12810: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.14122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.15870: done with get_vars() 30529 1726882603.15892: done getting variables 30529 1726882603.16020: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:43 -0400 (0:00:00.145) 0:00:17.186 ****** 30529 1726882603.16046: entering _queue_task() for managed_node1/stat 30529 1726882603.16501: worker is 1 (out of 1 available) 30529 1726882603.16515: exiting _queue_task() for managed_node1/stat 30529 1726882603.16526: done queuing things up, now waiting for results queue to drain 30529 1726882603.16528: waiting for pending results... 
30529 1726882603.16726: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882603.16863: in run() - task 12673a56-9f93-b0f1-edc0-000000000691 30529 1726882603.16888: variable 'ansible_search_path' from source: unknown 30529 1726882603.16899: variable 'ansible_search_path' from source: unknown 30529 1726882603.16938: calling self._execute() 30529 1726882603.17037: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.17048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.17066: variable 'omit' from source: magic vars 30529 1726882603.17501: variable 'ansible_distribution_major_version' from source: facts 30529 1726882603.17505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882603.17519: variable 'omit' from source: magic vars 30529 1726882603.17581: variable 'omit' from source: magic vars 30529 1726882603.17717: variable 'interface' from source: play vars 30529 1726882603.17720: variable 'omit' from source: magic vars 30529 1726882603.17763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882603.17807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882603.17836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882603.17898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882603.17901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882603.17917: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882603.17924: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.17936: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.18048: Set connection var ansible_shell_executable to /bin/sh 30529 1726882603.18060: Set connection var ansible_pipelining to False 30529 1726882603.18067: Set connection var ansible_shell_type to sh 30529 1726882603.18097: Set connection var ansible_timeout to 10 30529 1726882603.18100: Set connection var ansible_connection to ssh 30529 1726882603.18105: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882603.18152: variable 'ansible_shell_executable' from source: unknown 30529 1726882603.18155: variable 'ansible_connection' from source: unknown 30529 1726882603.18158: variable 'ansible_module_compression' from source: unknown 30529 1726882603.18160: variable 'ansible_shell_type' from source: unknown 30529 1726882603.18162: variable 'ansible_shell_executable' from source: unknown 30529 1726882603.18164: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.18166: variable 'ansible_pipelining' from source: unknown 30529 1726882603.18262: variable 'ansible_timeout' from source: unknown 30529 1726882603.18265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.18399: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882603.18416: variable 'omit' from source: magic vars 30529 1726882603.18427: starting attempt loop 30529 1726882603.18435: running the handler 30529 1726882603.18453: _low_level_execute_command(): starting 30529 1726882603.18464: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882603.19249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882603.19290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.19371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.19416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.19470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.21137: stdout chunk (state=3): >>>/root <<< 30529 1726882603.21256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.21266: stderr chunk (state=3): >>><<< 30529 1726882603.21270: stdout chunk (state=3): >>><<< 30529 1726882603.21299: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.21309: _low_level_execute_command(): starting 30529 1726882603.21318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067 `" && echo ansible-tmp-1726882603.2129457-31344-120884364644067="` echo /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067 `" ) && sleep 0' 30529 1726882603.21866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882603.21899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.21903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882603.21906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882603.21909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882603.21921: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882603.21948: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.21951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882603.21976: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882603.21980: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.21983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882603.21985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.22044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.22072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.22140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.23990: stdout chunk (state=3): >>>ansible-tmp-1726882603.2129457-31344-120884364644067=/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067 <<< 30529 1726882603.24096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.24116: stderr chunk (state=3): >>><<< 30529 1726882603.24119: stdout chunk (state=3): >>><<< 30529 1726882603.24133: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882603.2129457-31344-120884364644067=/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.24166: variable 'ansible_module_compression' from source: unknown 30529 1726882603.24211: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882603.24244: variable 'ansible_facts' from source: unknown 30529 1726882603.24291: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py 30529 1726882603.24379: Sending initial data 30529 1726882603.24383: Sent initial data (153 bytes) 30529 1726882603.25013: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.25058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882603.25074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.25094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.25172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.26669: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882603.26674: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882603.26709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882603.26751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3libs2vz /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py <<< 30529 1726882603.26760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py" <<< 30529 1726882603.26798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3libs2vz" to remote "/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py" <<< 30529 1726882603.26802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py" <<< 30529 1726882603.27329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.27363: stderr chunk (state=3): >>><<< 30529 1726882603.27366: stdout chunk (state=3): >>><<< 30529 1726882603.27390: done transferring module to remote 30529 1726882603.27397: _low_level_execute_command(): starting 30529 1726882603.27401: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/ /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py && sleep 0' 30529 1726882603.27821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.27824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882603.27826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.27828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.27833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882603.27835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.27879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882603.27886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.27924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.29630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.29650: stderr chunk (state=3): >>><<< 30529 1726882603.29655: stdout chunk (state=3): >>><<< 30529 1726882603.29668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.29671: _low_level_execute_command(): starting 30529 1726882603.29675: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/AnsiballZ_stat.py && sleep 0' 30529 1726882603.30054: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882603.30058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.30071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.30118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.30136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.30177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.45153: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882603.46336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882603.46355: stderr chunk (state=3): >>><<< 30529 1726882603.46359: stdout chunk (state=3): >>><<< 30529 1726882603.46374: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882603.46403: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882603.46413: _low_level_execute_command(): starting 30529 1726882603.46418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882603.2129457-31344-120884364644067/ > /dev/null 2>&1 && sleep 0' 30529 1726882603.46860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882603.46864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.46866: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882603.46868: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882603.46870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.46926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882603.46933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.46935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.46973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.48754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.48774: stderr chunk (state=3): >>><<< 30529 1726882603.48777: stdout chunk (state=3): >>><<< 30529 1726882603.48791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.48796: handler run complete 30529 1726882603.48813: attempt loop complete, returning result 30529 1726882603.48816: _execute() done 30529 1726882603.48819: dumping result to json 30529 1726882603.48823: done dumping result, returning 30529 1726882603.48831: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000000691] 30529 1726882603.48834: sending task result for task 12673a56-9f93-b0f1-edc0-000000000691 30529 1726882603.48923: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000691 30529 1726882603.48925: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882603.48980: no more pending results, returning what we have 30529 1726882603.48984: results queue empty 30529 1726882603.48985: checking for any_errors_fatal 30529 1726882603.48988: done checking for any_errors_fatal 30529 1726882603.48989: checking for max_fail_percentage 30529 1726882603.48991: done checking for max_fail_percentage 30529 1726882603.48991: checking to see if all hosts have failed and the running result is not ok 30529 1726882603.48992: done checking to see if all hosts have failed 30529 1726882603.48995: getting the remaining hosts for this loop 30529 1726882603.48996: done getting the remaining hosts for this loop 30529 1726882603.49000: getting the next task for host managed_node1 30529 1726882603.49009: done getting next task for host managed_node1 
30529 1726882603.49011: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30529 1726882603.49015: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882603.49019: getting variables 30529 1726882603.49021: in VariableManager get_vars() 30529 1726882603.49051: Calling all_inventory to load vars for managed_node1 30529 1726882603.49054: Calling groups_inventory to load vars for managed_node1 30529 1726882603.49057: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.49067: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.49070: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.49073: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.49946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.50818: done with get_vars() 30529 1726882603.50836: done getting variables 30529 1726882603.50877: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882603.50966: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:43 -0400 (0:00:00.349) 0:00:17.536 ****** 30529 1726882603.50991: entering _queue_task() for managed_node1/assert 30529 1726882603.51213: worker is 1 (out of 1 available) 30529 1726882603.51225: exiting _queue_task() for managed_node1/assert 30529 1726882603.51238: done queuing things up, now waiting for results queue to drain 30529 1726882603.51239: waiting for pending results... 30529 1726882603.51416: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' 30529 1726882603.51481: in run() - task 12673a56-9f93-b0f1-edc0-000000000643 30529 1726882603.51495: variable 'ansible_search_path' from source: unknown 30529 1726882603.51499: variable 'ansible_search_path' from source: unknown 30529 1726882603.51526: calling self._execute() 30529 1726882603.51596: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.51601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.51610: variable 'omit' from source: magic vars 30529 1726882603.51866: variable 'ansible_distribution_major_version' from source: facts 30529 1726882603.51875: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882603.51881: variable 'omit' from source: magic vars 30529 1726882603.51916: variable 'omit' from source: magic vars 30529 1726882603.51980: variable 'interface' from source: play vars 30529 1726882603.51995: variable 'omit' from source: magic vars 30529 1726882603.52027: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882603.52055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882603.52070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882603.52083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882603.52096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882603.52120: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882603.52124: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.52126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.52198: Set connection var ansible_shell_executable to /bin/sh 30529 1726882603.52201: Set connection var ansible_pipelining to False 30529 1726882603.52204: Set connection var ansible_shell_type to sh 30529 1726882603.52212: Set connection var ansible_timeout to 10 30529 1726882603.52215: Set connection var ansible_connection to ssh 30529 1726882603.52220: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882603.52237: variable 'ansible_shell_executable' from source: unknown 30529 1726882603.52240: variable 'ansible_connection' from source: unknown 30529 1726882603.52243: variable 'ansible_module_compression' from source: unknown 30529 1726882603.52245: variable 'ansible_shell_type' from source: unknown 30529 1726882603.52247: variable 'ansible_shell_executable' from source: unknown 30529 1726882603.52249: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.52253: variable 'ansible_pipelining' from source: unknown 30529 1726882603.52257: variable 'ansible_timeout' 
from source: unknown 30529 1726882603.52259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.52357: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882603.52366: variable 'omit' from source: magic vars 30529 1726882603.52377: starting attempt loop 30529 1726882603.52380: running the handler 30529 1726882603.52470: variable 'interface_stat' from source: set_fact 30529 1726882603.52480: Evaluated conditional (not interface_stat.stat.exists): True 30529 1726882603.52483: handler run complete 30529 1726882603.52498: attempt loop complete, returning result 30529 1726882603.52501: _execute() done 30529 1726882603.52504: dumping result to json 30529 1726882603.52506: done dumping result, returning 30529 1726882603.52511: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' [12673a56-9f93-b0f1-edc0-000000000643] 30529 1726882603.52516: sending task result for task 12673a56-9f93-b0f1-edc0-000000000643 30529 1726882603.52597: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000643 30529 1726882603.52600: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882603.52642: no more pending results, returning what we have 30529 1726882603.52645: results queue empty 30529 1726882603.52646: checking for any_errors_fatal 30529 1726882603.52656: done checking for any_errors_fatal 30529 1726882603.52657: checking for max_fail_percentage 30529 1726882603.52659: done checking for max_fail_percentage 30529 1726882603.52659: checking to see if all hosts have failed and the running result is not ok 30529 1726882603.52660: done checking to see if all hosts have failed 30529 
1726882603.52661: getting the remaining hosts for this loop 30529 1726882603.52663: done getting the remaining hosts for this loop 30529 1726882603.52666: getting the next task for host managed_node1 30529 1726882603.52672: done getting next task for host managed_node1 30529 1726882603.52675: ^ task is: TASK: Test 30529 1726882603.52677: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882603.52681: getting variables 30529 1726882603.52682: in VariableManager get_vars() 30529 1726882603.52716: Calling all_inventory to load vars for managed_node1 30529 1726882603.52718: Calling groups_inventory to load vars for managed_node1 30529 1726882603.52721: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.52730: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.52732: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.52735: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.53473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.54340: done with get_vars() 30529 1726882603.54353: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:36:43 -0400 (0:00:00.034) 
0:00:17.570 ****** 30529 1726882603.54417: entering _queue_task() for managed_node1/include_tasks 30529 1726882603.54615: worker is 1 (out of 1 available) 30529 1726882603.54629: exiting _queue_task() for managed_node1/include_tasks 30529 1726882603.54641: done queuing things up, now waiting for results queue to drain 30529 1726882603.54643: waiting for pending results... 30529 1726882603.54802: running TaskExecutor() for managed_node1/TASK: Test 30529 1726882603.54861: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b8 30529 1726882603.54876: variable 'ansible_search_path' from source: unknown 30529 1726882603.54879: variable 'ansible_search_path' from source: unknown 30529 1726882603.54911: variable 'lsr_test' from source: include params 30529 1726882603.55052: variable 'lsr_test' from source: include params 30529 1726882603.55103: variable 'omit' from source: magic vars 30529 1726882603.55184: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.55192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.55204: variable 'omit' from source: magic vars 30529 1726882603.55358: variable 'ansible_distribution_major_version' from source: facts 30529 1726882603.55365: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882603.55371: variable 'item' from source: unknown 30529 1726882603.55422: variable 'item' from source: unknown 30529 1726882603.55442: variable 'item' from source: unknown 30529 1726882603.55483: variable 'item' from source: unknown 30529 1726882603.55606: dumping result to json 30529 1726882603.55609: done dumping result, returning 30529 1726882603.55611: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-0000000005b8] 30529 1726882603.55613: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b8 30529 1726882603.55652: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b8 30529 
1726882603.55654: WORKER PROCESS EXITING 30529 1726882603.55673: no more pending results, returning what we have 30529 1726882603.55677: in VariableManager get_vars() 30529 1726882603.55706: Calling all_inventory to load vars for managed_node1 30529 1726882603.55709: Calling groups_inventory to load vars for managed_node1 30529 1726882603.55712: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.55721: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.55723: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.55725: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.56549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.57383: done with get_vars() 30529 1726882603.57400: variable 'ansible_search_path' from source: unknown 30529 1726882603.57401: variable 'ansible_search_path' from source: unknown 30529 1726882603.57425: we have included files to process 30529 1726882603.57425: generating all_blocks data 30529 1726882603.57426: done generating all_blocks data 30529 1726882603.57430: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30529 1726882603.57430: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30529 1726882603.57432: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30529 1726882603.57627: done processing included file 30529 1726882603.57628: iterating over new_blocks loaded from include file 30529 1726882603.57629: in VariableManager get_vars() 30529 1726882603.57638: done with get_vars() 30529 1726882603.57639: filtering new block on tags 30529 
1726882603.57660: done filtering new block on tags 30529 1726882603.57661: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node1 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 30529 1726882603.57664: extending task lists for all hosts with included blocks 30529 1726882603.58124: done extending task lists 30529 1726882603.58125: done processing included files 30529 1726882603.58126: results queue empty 30529 1726882603.58126: checking for any_errors_fatal 30529 1726882603.58128: done checking for any_errors_fatal 30529 1726882603.58129: checking for max_fail_percentage 30529 1726882603.58130: done checking for max_fail_percentage 30529 1726882603.58130: checking to see if all hosts have failed and the running result is not ok 30529 1726882603.58130: done checking to see if all hosts have failed 30529 1726882603.58131: getting the remaining hosts for this loop 30529 1726882603.58132: done getting the remaining hosts for this loop 30529 1726882603.58133: getting the next task for host managed_node1 30529 1726882603.58137: done getting next task for host managed_node1 30529 1726882603.58138: ^ task is: TASK: Include network role 30529 1726882603.58139: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882603.58141: getting variables 30529 1726882603.58142: in VariableManager get_vars() 30529 1726882603.58149: Calling all_inventory to load vars for managed_node1 30529 1726882603.58151: Calling groups_inventory to load vars for managed_node1 30529 1726882603.58152: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.58156: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.58157: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.58159: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.58824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.59655: done with get_vars() 30529 1726882603.59671: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 21:36:43 -0400 (0:00:00.053) 0:00:17.623 ****** 30529 1726882603.59720: entering _queue_task() for managed_node1/include_role 30529 1726882603.59928: worker is 1 (out of 1 available) 30529 1726882603.59940: exiting _queue_task() for managed_node1/include_role 30529 1726882603.59952: done queuing things up, now waiting for results queue to drain 30529 1726882603.59953: waiting for pending results... 
30529 1726882603.60133: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882603.60213: in run() - task 12673a56-9f93-b0f1-edc0-0000000006b1 30529 1726882603.60223: variable 'ansible_search_path' from source: unknown 30529 1726882603.60226: variable 'ansible_search_path' from source: unknown 30529 1726882603.60261: calling self._execute() 30529 1726882603.60334: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882603.60338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882603.60346: variable 'omit' from source: magic vars 30529 1726882603.60605: variable 'ansible_distribution_major_version' from source: facts 30529 1726882603.60618: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882603.60621: _execute() done 30529 1726882603.60625: dumping result to json 30529 1726882603.60627: done dumping result, returning 30529 1726882603.60633: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-0000000006b1] 30529 1726882603.60638: sending task result for task 12673a56-9f93-b0f1-edc0-0000000006b1 30529 1726882603.60742: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000006b1 30529 1726882603.60768: no more pending results, returning what we have 30529 1726882603.60773: in VariableManager get_vars() 30529 1726882603.60806: Calling all_inventory to load vars for managed_node1 30529 1726882603.60809: Calling groups_inventory to load vars for managed_node1 30529 1726882603.60812: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.60824: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.60826: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.60829: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.61571: WORKER PROCESS EXITING 30529 1726882603.61581: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.62440: done with get_vars() 30529 1726882603.62453: variable 'ansible_search_path' from source: unknown 30529 1726882603.62454: variable 'ansible_search_path' from source: unknown 30529 1726882603.62561: variable 'omit' from source: magic vars 30529 1726882603.62589: variable 'omit' from source: magic vars 30529 1726882603.62600: variable 'omit' from source: magic vars 30529 1726882603.62603: we have included files to process 30529 1726882603.62603: generating all_blocks data 30529 1726882603.62604: done generating all_blocks data 30529 1726882603.62605: processing included file: fedora.linux_system_roles.network 30529 1726882603.62619: in VariableManager get_vars() 30529 1726882603.62627: done with get_vars() 30529 1726882603.62645: in VariableManager get_vars() 30529 1726882603.62656: done with get_vars() 30529 1726882603.62682: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882603.62752: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882603.62806: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882603.63060: in VariableManager get_vars() 30529 1726882603.63072: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882603.64296: iterating over new_blocks loaded from include file 30529 1726882603.64298: in VariableManager get_vars() 30529 1726882603.64309: done with get_vars() 30529 1726882603.64310: filtering new block on tags 30529 1726882603.64506: done filtering new block on tags 30529 1726882603.64508: in VariableManager get_vars() 30529 1726882603.64517: done with get_vars() 30529 1726882603.64518: filtering new block on tags 30529 1726882603.64528: done 
filtering new block on tags 30529 1726882603.64529: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882603.64532: extending task lists for all hosts with included blocks 30529 1726882603.64624: done extending task lists 30529 1726882603.64625: done processing included files 30529 1726882603.64626: results queue empty 30529 1726882603.64626: checking for any_errors_fatal 30529 1726882603.64629: done checking for any_errors_fatal 30529 1726882603.64629: checking for max_fail_percentage 30529 1726882603.64630: done checking for max_fail_percentage 30529 1726882603.64630: checking to see if all hosts have failed and the running result is not ok 30529 1726882603.64631: done checking to see if all hosts have failed 30529 1726882603.64631: getting the remaining hosts for this loop 30529 1726882603.64632: done getting the remaining hosts for this loop 30529 1726882603.64634: getting the next task for host managed_node1 30529 1726882603.64637: done getting next task for host managed_node1 30529 1726882603.64638: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882603.64640: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882603.64646: getting variables 30529 1726882603.64647: in VariableManager get_vars() 30529 1726882603.64654: Calling all_inventory to load vars for managed_node1 30529 1726882603.64656: Calling groups_inventory to load vars for managed_node1 30529 1726882603.64657: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882603.64660: Calling all_plugins_play to load vars for managed_node1 30529 1726882603.64661: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882603.64663: Calling groups_plugins_play to load vars for managed_node1 30529 1726882603.65279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882603.66121: done with get_vars() 30529 1726882603.66135: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:43 -0400 (0:00:00.064) 0:00:17.687 ****** 30529 1726882603.66181: entering _queue_task() for managed_node1/include_tasks 30529 1726882603.66418: worker is 1 (out of 1 available) 30529 1726882603.66432: exiting _queue_task() for managed_node1/include_tasks 30529 1726882603.66444: done queuing things up, now waiting for results queue to drain 30529 1726882603.66445: waiting for pending results... 
30529 1726882603.66618: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30529 1726882603.66696: in run() - task 12673a56-9f93-b0f1-edc0-00000000072f
30529 1726882603.66707: variable 'ansible_search_path' from source: unknown
30529 1726882603.66710: variable 'ansible_search_path' from source: unknown
30529 1726882603.66737: calling self._execute()
30529 1726882603.66805: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.66808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.66816: variable 'omit' from source: magic vars
30529 1726882603.67071: variable 'ansible_distribution_major_version' from source: facts
30529 1726882603.67081: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882603.67089: _execute() done
30529 1726882603.67092: dumping result to json
30529 1726882603.67097: done dumping result, returning
30529 1726882603.67100: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-00000000072f]
30529 1726882603.67111: sending task result for task 12673a56-9f93-b0f1-edc0-00000000072f
30529 1726882603.67189: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000072f
30529 1726882603.67191: WORKER PROCESS EXITING
30529 1726882603.67250: no more pending results, returning what we have
30529 1726882603.67255: in VariableManager get_vars()
30529 1726882603.67291: Calling all_inventory to load vars for managed_node1
30529 1726882603.67300: Calling groups_inventory to load vars for managed_node1
30529 1726882603.67303: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882603.67311: Calling all_plugins_play to load vars for managed_node1
30529 1726882603.67313: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882603.67316: Calling groups_plugins_play to load vars for managed_node1
30529 1726882603.68131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882603.68988: done with get_vars()
30529 1726882603.69002: variable 'ansible_search_path' from source: unknown
30529 1726882603.69003: variable 'ansible_search_path' from source: unknown
30529 1726882603.69029: we have included files to process
30529 1726882603.69030: generating all_blocks data
30529 1726882603.69032: done generating all_blocks data
30529 1726882603.69035: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882603.69036: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882603.69037: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882603.69403: done processing included file
30529 1726882603.69404: iterating over new_blocks loaded from include file
30529 1726882603.69405: in VariableManager get_vars()
30529 1726882603.69420: done with get_vars()
30529 1726882603.69421: filtering new block on tags
30529 1726882603.69439: done filtering new block on tags
30529 1726882603.69440: in VariableManager get_vars()
30529 1726882603.69453: done with get_vars()
30529 1726882603.69454: filtering new block on tags
30529 1726882603.69481: done filtering new block on tags
30529 1726882603.69483: in VariableManager get_vars()
30529 1726882603.69499: done with get_vars()
30529 1726882603.69500: filtering new block on tags
30529 1726882603.69523: done filtering new block on tags
30529 1726882603.69525: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
30529 1726882603.69528: extending task lists for all hosts with included blocks
30529 1726882603.70464: done extending task lists
30529 1726882603.70465: done processing included files
30529 1726882603.70466: results queue empty
30529 1726882603.70466: checking for any_errors_fatal
30529 1726882603.70468: done checking for any_errors_fatal
30529 1726882603.70468: checking for max_fail_percentage
30529 1726882603.70469: done checking for max_fail_percentage
30529 1726882603.70470: checking to see if all hosts have failed and the running result is not ok
30529 1726882603.70470: done checking to see if all hosts have failed
30529 1726882603.70471: getting the remaining hosts for this loop
30529 1726882603.70472: done getting the remaining hosts for this loop
30529 1726882603.70473: getting the next task for host managed_node1
30529 1726882603.70477: done getting next task for host managed_node1
30529 1726882603.70478: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882603.70481: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882603.70489: getting variables
30529 1726882603.70490: in VariableManager get_vars()
30529 1726882603.70499: Calling all_inventory to load vars for managed_node1
30529 1726882603.70501: Calling groups_inventory to load vars for managed_node1
30529 1726882603.70502: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882603.70505: Calling all_plugins_play to load vars for managed_node1
30529 1726882603.70506: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882603.70508: Calling groups_plugins_play to load vars for managed_node1
30529 1726882603.71120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882603.71958: done with get_vars()
30529 1726882603.71974: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:36:43 -0400 (0:00:00.058) 0:00:17.746 ******
30529 1726882603.72026: entering _queue_task() for managed_node1/setup
30529 1726882603.72258: worker is 1 (out of 1 available)
30529 1726882603.72271: exiting _queue_task() for managed_node1/setup
30529 1726882603.72284: done queuing things up, now waiting for results queue to drain
30529 1726882603.72288: waiting for pending results...
30529 1726882603.72453: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882603.72548: in run() - task 12673a56-9f93-b0f1-edc0-00000000078c
30529 1726882603.72560: variable 'ansible_search_path' from source: unknown
30529 1726882603.72564: variable 'ansible_search_path' from source: unknown
30529 1726882603.72594: calling self._execute()
30529 1726882603.72655: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.72659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.72667: variable 'omit' from source: magic vars
30529 1726882603.72923: variable 'ansible_distribution_major_version' from source: facts
30529 1726882603.72932: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882603.73076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30529 1726882603.74568: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30529 1726882603.74615: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30529 1726882603.74642: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30529 1726882603.74667: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30529 1726882603.74692: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30529 1726882603.74747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882603.74767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882603.74784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882603.74817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882603.74828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882603.74864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882603.74880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882603.74899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882603.74927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882603.74938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882603.75045: variable '__network_required_facts' from source: role '' defaults
30529 1726882603.75051: variable 'ansible_facts' from source: unknown
30529 1726882603.75460: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
30529 1726882603.75464: when evaluation is False, skipping this task
30529 1726882603.75466: _execute() done
30529 1726882603.75469: dumping result to json
30529 1726882603.75471: done dumping result, returning
30529 1726882603.75477: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-00000000078c]
30529 1726882603.75480: sending task result for task 12673a56-9f93-b0f1-edc0-00000000078c
30529 1726882603.75562: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000078c
30529 1726882603.75565: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30529 1726882603.75611: no more pending results, returning what we have
30529 1726882603.75615: results queue empty
30529 1726882603.75616: checking for any_errors_fatal
30529 1726882603.75617: done checking for any_errors_fatal
30529 1726882603.75618: checking for max_fail_percentage
30529 1726882603.75619: done checking for max_fail_percentage
30529 1726882603.75620: checking to see if all hosts have failed and the running result is not ok
30529 1726882603.75621: done checking to see if all hosts have failed
30529 1726882603.75622: getting the remaining hosts for this loop
30529 1726882603.75623: done getting the remaining hosts for this loop
30529 1726882603.75627: getting the next task for host managed_node1
30529 1726882603.75638: done getting next task for host managed_node1
30529 1726882603.75641: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
30529 1726882603.75647: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882603.75662: getting variables
30529 1726882603.75663: in VariableManager get_vars()
30529 1726882603.75712: Calling all_inventory to load vars for managed_node1
30529 1726882603.75715: Calling groups_inventory to load vars for managed_node1
30529 1726882603.75717: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882603.75726: Calling all_plugins_play to load vars for managed_node1
30529 1726882603.75728: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882603.75736: Calling groups_plugins_play to load vars for managed_node1
30529 1726882603.76558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882603.77430: done with get_vars()
30529 1726882603.77444: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:36:43 -0400 (0:00:00.054) 0:00:17.801 ******
30529 1726882603.77510: entering _queue_task() for managed_node1/stat
30529 1726882603.77724: worker is 1 (out of 1 available)
30529 1726882603.77740: exiting _queue_task() for managed_node1/stat
30529 1726882603.77751: done queuing things up, now waiting for results queue to drain
30529 1726882603.77753: waiting for pending results...
30529 1726882603.77913: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
30529 1726882603.78000: in run() - task 12673a56-9f93-b0f1-edc0-00000000078e
30529 1726882603.78013: variable 'ansible_search_path' from source: unknown
30529 1726882603.78017: variable 'ansible_search_path' from source: unknown
30529 1726882603.78043: calling self._execute()
30529 1726882603.78111: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.78115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.78123: variable 'omit' from source: magic vars
30529 1726882603.78369: variable 'ansible_distribution_major_version' from source: facts
30529 1726882603.78379: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882603.78494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30529 1726882603.78672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30529 1726882603.78705: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30529 1726882603.78728: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30529 1726882603.78755: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30529 1726882603.78815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30529 1726882603.78833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30529 1726882603.78856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882603.78872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30529 1726882603.78934: variable '__network_is_ostree' from source: set_fact
30529 1726882603.78938: Evaluated conditional (not __network_is_ostree is defined): False
30529 1726882603.78941: when evaluation is False, skipping this task
30529 1726882603.78943: _execute() done
30529 1726882603.78946: dumping result to json
30529 1726882603.78950: done dumping result, returning
30529 1726882603.78958: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-00000000078e]
30529 1726882603.78961: sending task result for task 12673a56-9f93-b0f1-edc0-00000000078e
30529 1726882603.79042: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000078e
30529 1726882603.79045: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30529 1726882603.79117: no more pending results, returning what we have
30529 1726882603.79120: results queue empty
30529 1726882603.79121: checking for any_errors_fatal
30529 1726882603.79126: done checking for any_errors_fatal
30529 1726882603.79127: checking for max_fail_percentage
30529 1726882603.79128: done checking for max_fail_percentage
30529 1726882603.79129: checking to see if all hosts have failed and the running result is not ok
30529 1726882603.79130: done checking to see if all hosts have failed
30529 1726882603.79130: getting the remaining hosts for this loop
30529 1726882603.79132: done getting the remaining hosts for this loop
30529 1726882603.79135: getting the next task for host managed_node1
30529 1726882603.79140: done getting next task for host managed_node1
30529 1726882603.79143: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30529 1726882603.79147: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882603.79161: getting variables
30529 1726882603.79162: in VariableManager get_vars()
30529 1726882603.79192: Calling all_inventory to load vars for managed_node1
30529 1726882603.79196: Calling groups_inventory to load vars for managed_node1
30529 1726882603.79198: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882603.79206: Calling all_plugins_play to load vars for managed_node1
30529 1726882603.79208: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882603.79211: Calling groups_plugins_play to load vars for managed_node1
30529 1726882603.79913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882603.80854: done with get_vars()
30529 1726882603.80868: done getting variables
30529 1726882603.80910: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:36:43 -0400 (0:00:00.034) 0:00:17.835 ******
30529 1726882603.80934: entering _queue_task() for managed_node1/set_fact
30529 1726882603.81130: worker is 1 (out of 1 available)
30529 1726882603.81144: exiting _queue_task() for managed_node1/set_fact
30529 1726882603.81155: done queuing things up, now waiting for results queue to drain
30529 1726882603.81157: waiting for pending results...
30529 1726882603.81315: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30529 1726882603.81397: in run() - task 12673a56-9f93-b0f1-edc0-00000000078f
30529 1726882603.81410: variable 'ansible_search_path' from source: unknown
30529 1726882603.81413: variable 'ansible_search_path' from source: unknown
30529 1726882603.81438: calling self._execute()
30529 1726882603.81501: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.81505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.81515: variable 'omit' from source: magic vars
30529 1726882603.81763: variable 'ansible_distribution_major_version' from source: facts
30529 1726882603.81772: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882603.81882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30529 1726882603.82064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30529 1726882603.82095: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30529 1726882603.82120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30529 1726882603.82145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30529 1726882603.82206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30529 1726882603.82223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30529 1726882603.82241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882603.82260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30529 1726882603.82322: variable '__network_is_ostree' from source: set_fact
30529 1726882603.82327: Evaluated conditional (not __network_is_ostree is defined): False
30529 1726882603.82330: when evaluation is False, skipping this task
30529 1726882603.82333: _execute() done
30529 1726882603.82335: dumping result to json
30529 1726882603.82338: done dumping result, returning
30529 1726882603.82345: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-00000000078f]
30529 1726882603.82348: sending task result for task 12673a56-9f93-b0f1-edc0-00000000078f
30529 1726882603.82431: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000078f
30529 1726882603.82434: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30529 1726882603.82506: no more pending results, returning what we have
30529 1726882603.82509: results queue empty
30529 1726882603.82510: checking for any_errors_fatal
30529 1726882603.82514: done checking for any_errors_fatal
30529 1726882603.82515: checking for max_fail_percentage
30529 1726882603.82516: done checking for max_fail_percentage
30529 1726882603.82517: checking to see if all hosts have failed and the running result is not ok
30529 1726882603.82518: done checking to see if all hosts have failed
30529 1726882603.82519: getting the remaining hosts for this loop
30529 1726882603.82520: done getting the remaining hosts for this loop
30529 1726882603.82523: getting the next task for host managed_node1
30529 1726882603.82530: done getting next task for host managed_node1
30529 1726882603.82534: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
30529 1726882603.82538: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882603.82552: getting variables
30529 1726882603.82554: in VariableManager get_vars()
30529 1726882603.82580: Calling all_inventory to load vars for managed_node1
30529 1726882603.82582: Calling groups_inventory to load vars for managed_node1
30529 1726882603.82584: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882603.82595: Calling all_plugins_play to load vars for managed_node1
30529 1726882603.82597: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882603.82599: Calling groups_plugins_play to load vars for managed_node1
30529 1726882603.83380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882603.84236: done with get_vars()
30529 1726882603.84251: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:36:43 -0400 (0:00:00.033) 0:00:17.869 ******
30529 1726882603.84314: entering _queue_task() for managed_node1/service_facts
30529 1726882603.84500: worker is 1 (out of 1 available)
30529 1726882603.84514: exiting _queue_task() for managed_node1/service_facts
30529 1726882603.84527: done queuing things up, now waiting for results queue to drain
30529 1726882603.84528: waiting for pending results...
30529 1726882603.84940: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running
30529 1726882603.84945: in run() - task 12673a56-9f93-b0f1-edc0-000000000791
30529 1726882603.84948: variable 'ansible_search_path' from source: unknown
30529 1726882603.84951: variable 'ansible_search_path' from source: unknown
30529 1726882603.84954: calling self._execute()
30529 1726882603.84990: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.84996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.85006: variable 'omit' from source: magic vars
30529 1726882603.85373: variable 'ansible_distribution_major_version' from source: facts
30529 1726882603.85384: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882603.85389: variable 'omit' from source: magic vars
30529 1726882603.85475: variable 'omit' from source: magic vars
30529 1726882603.85509: variable 'omit' from source: magic vars
30529 1726882603.85544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882603.85589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882603.85607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882603.85625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882603.85636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882603.85674: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882603.85677: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.85680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.85907: Set connection var ansible_shell_executable to /bin/sh
30529 1726882603.85911: Set connection var ansible_pipelining to False
30529 1726882603.85913: Set connection var ansible_shell_type to sh
30529 1726882603.85916: Set connection var ansible_timeout to 10
30529 1726882603.85918: Set connection var ansible_connection to ssh
30529 1726882603.85920: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882603.85922: variable 'ansible_shell_executable' from source: unknown
30529 1726882603.85924: variable 'ansible_connection' from source: unknown
30529 1726882603.85927: variable 'ansible_module_compression' from source: unknown
30529 1726882603.85929: variable 'ansible_shell_type' from source: unknown
30529 1726882603.85931: variable 'ansible_shell_executable' from source: unknown
30529 1726882603.85932: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882603.85934: variable 'ansible_pipelining' from source: unknown
30529 1726882603.85936: variable 'ansible_timeout' from source: unknown
30529 1726882603.85938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882603.86043: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
30529 1726882603.86052: variable 'omit' from source: magic vars
30529 1726882603.86057: starting attempt loop
30529 1726882603.86059: running the handler
30529 1726882603.86073: _low_level_execute_command(): starting
30529 1726882603.86081: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30529 1726882603.86691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30529 1726882603.86710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882603.86759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
30529 1726882603.86773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882603.86818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882603.88427: stdout chunk (state=3): >>>/root <<<
30529 1726882603.88663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882603.88667: stdout chunk (state=3): >>><<<
30529 1726882603.88670: stderr chunk (state=3): >>><<<
30529 1726882603.88674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30529 1726882603.88676: _low_level_execute_command(): starting
30529 1726882603.88678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994 `" && echo ansible-tmp-1726882603.8858576-31370-2050184857994="` echo /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994 `" ) && sleep 0'
30529 1726882603.89183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30529 1726882603.89239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
30529 1726882603.89262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30529 1726882603.89337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30529 1726882603.91197: stdout chunk (state=3): >>>ansible-tmp-1726882603.8858576-31370-2050184857994=/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994 <<<
30529 1726882603.91332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30529 1726882603.91335: stdout chunk (state=3): >>><<<
30529 1726882603.91337: stderr chunk (state=3): >>><<<
30529 1726882603.91349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882603.8858576-31370-2050184857994=/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.91498: variable 'ansible_module_compression' from source: unknown 30529 1726882603.91501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882603.91503: variable 'ansible_facts' from source: unknown 30529 1726882603.91564: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py 30529 1726882603.91713: Sending initial data 30529 1726882603.91722: Sent initial data (160 bytes) 30529 1726882603.92242: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882603.92256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.92272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882603.92308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882603.92321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882603.92409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.92438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.92503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.94050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882603.94106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882603.94173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpbm384iz9 /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py <<< 30529 1726882603.94183: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py" <<< 30529 1726882603.94213: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30529 1726882603.94240: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpbm384iz9" to remote "/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py" <<< 30529 1726882603.95099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.95104: stdout chunk (state=3): >>><<< 30529 1726882603.95106: stderr chunk (state=3): >>><<< 30529 1726882603.95117: done transferring module to remote 30529 1726882603.95130: _low_level_execute_command(): starting 30529 1726882603.95138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/ /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py && sleep 0' 30529 1726882603.95755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882603.95773: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.95809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882603.95883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.95906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882603.95980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882603.97730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882603.97734: stdout chunk (state=3): >>><<< 30529 1726882603.97736: stderr chunk (state=3): >>><<< 30529 1726882603.97835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882603.97839: _low_level_execute_command(): starting 30529 1726882603.97842: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/AnsiballZ_service_facts.py && sleep 0' 30529 1726882603.98407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882603.98512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882603.98531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882603.98551: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882603.98623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.49509: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882605.49565: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882605.51499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882605.51503: stdout chunk (state=3): >>><<< 30529 1726882605.51506: stderr chunk (state=3): >>><<< 30529 1726882605.51510: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882605.53322: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882605.53344: _low_level_execute_command(): starting 30529 1726882605.53476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882603.8858576-31370-2050184857994/ > /dev/null 2>&1 && sleep 0' 30529 1726882605.54673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 30529 1726882605.54913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.54987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.57100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882605.57105: stdout chunk (state=3): >>><<< 30529 1726882605.57108: stderr chunk (state=3): >>><<< 30529 1726882605.57111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882605.57113: handler run complete 30529 1726882605.57267: variable 'ansible_facts' from source: unknown 30529 1726882605.57596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 
1726882605.58701: variable 'ansible_facts' from source: unknown 30529 1726882605.59098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882605.59299: attempt loop complete, returning result 30529 1726882605.59310: _execute() done 30529 1726882605.59317: dumping result to json 30529 1726882605.59380: done dumping result, returning 30529 1726882605.59395: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000000791] 30529 1726882605.59404: sending task result for task 12673a56-9f93-b0f1-edc0-000000000791 30529 1726882605.61100: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000791 30529 1726882605.61103: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882605.61228: no more pending results, returning what we have 30529 1726882605.61230: results queue empty 30529 1726882605.61231: checking for any_errors_fatal 30529 1726882605.61234: done checking for any_errors_fatal 30529 1726882605.61235: checking for max_fail_percentage 30529 1726882605.61236: done checking for max_fail_percentage 30529 1726882605.61237: checking to see if all hosts have failed and the running result is not ok 30529 1726882605.61238: done checking to see if all hosts have failed 30529 1726882605.61239: getting the remaining hosts for this loop 30529 1726882605.61240: done getting the remaining hosts for this loop 30529 1726882605.61244: getting the next task for host managed_node1 30529 1726882605.61250: done getting next task for host managed_node1 30529 1726882605.61253: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882605.61259: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882605.61268: getting variables 30529 1726882605.61269: in VariableManager get_vars() 30529 1726882605.61299: Calling all_inventory to load vars for managed_node1 30529 1726882605.61302: Calling groups_inventory to load vars for managed_node1 30529 1726882605.61304: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882605.61312: Calling all_plugins_play to load vars for managed_node1 30529 1726882605.61315: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882605.61318: Calling groups_plugins_play to load vars for managed_node1 30529 1726882605.63623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882605.65685: done with get_vars() 30529 1726882605.65716: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:45 -0400 (0:00:01.815) 0:00:19.684 ****** 30529 1726882605.65830: entering _queue_task() for managed_node1/package_facts 30529 1726882605.66312: worker is 1 (out of 1 available) 30529 1726882605.66324: exiting _queue_task() for managed_node1/package_facts 30529 1726882605.66336: done queuing things up, now waiting for results queue to drain 30529 1726882605.66338: waiting for pending results... 
30529 1726882605.66699: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882605.66737: in run() - task 12673a56-9f93-b0f1-edc0-000000000792 30529 1726882605.66757: variable 'ansible_search_path' from source: unknown 30529 1726882605.66765: variable 'ansible_search_path' from source: unknown 30529 1726882605.66809: calling self._execute() 30529 1726882605.66910: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882605.66921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882605.66938: variable 'omit' from source: magic vars 30529 1726882605.67330: variable 'ansible_distribution_major_version' from source: facts 30529 1726882605.67363: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882605.67370: variable 'omit' from source: magic vars 30529 1726882605.67450: variable 'omit' from source: magic vars 30529 1726882605.67589: variable 'omit' from source: magic vars 30529 1726882605.67594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882605.67598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882605.67600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882605.67622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882605.67639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882605.67674: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882605.67683: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882605.67702: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882605.67810: Set connection var ansible_shell_executable to /bin/sh 30529 1726882605.67821: Set connection var ansible_pipelining to False 30529 1726882605.67827: Set connection var ansible_shell_type to sh 30529 1726882605.67837: Set connection var ansible_timeout to 10 30529 1726882605.67842: Set connection var ansible_connection to ssh 30529 1726882605.67848: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882605.67869: variable 'ansible_shell_executable' from source: unknown 30529 1726882605.67908: variable 'ansible_connection' from source: unknown 30529 1726882605.68197: variable 'ansible_module_compression' from source: unknown 30529 1726882605.68200: variable 'ansible_shell_type' from source: unknown 30529 1726882605.68203: variable 'ansible_shell_executable' from source: unknown 30529 1726882605.68205: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882605.68207: variable 'ansible_pipelining' from source: unknown 30529 1726882605.68208: variable 'ansible_timeout' from source: unknown 30529 1726882605.68210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882605.68368: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882605.68442: variable 'omit' from source: magic vars 30529 1726882605.68451: starting attempt loop 30529 1726882605.68456: running the handler 30529 1726882605.68700: _low_level_execute_command(): starting 30529 1726882605.68704: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882605.69723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882605.69740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882605.69753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882605.69802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882605.70014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882605.70032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.70126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.71657: stdout chunk (state=3): >>>/root <<< 30529 1726882605.71758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882605.71795: stderr chunk (state=3): >>><<< 30529 1726882605.71805: stdout chunk (state=3): >>><<< 30529 1726882605.71875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882605.71910: _low_level_execute_command(): starting 30529 1726882605.72037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133 `" && echo ansible-tmp-1726882605.7188256-31439-279247721923133="` echo /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133 `" ) && sleep 0' 30529 1726882605.73301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882605.73411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882605.73415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.73470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.75316: stdout chunk (state=3): >>>ansible-tmp-1726882605.7188256-31439-279247721923133=/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133 <<< 30529 1726882605.75452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882605.75461: stdout chunk (state=3): >>><<< 30529 1726882605.75470: stderr chunk (state=3): >>><<< 30529 1726882605.75487: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882605.7188256-31439-279247721923133=/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882605.75700: variable 'ansible_module_compression' from source: unknown 30529 1726882605.75703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882605.75858: variable 'ansible_facts' from source: unknown 30529 1726882605.76252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py 30529 1726882605.76619: Sending initial data 30529 1726882605.76622: Sent initial data (162 bytes) 30529 1726882605.77526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882605.77532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882605.77535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882605.77537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.77580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.79090: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882605.79175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882605.79179: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpczzw7hz0 /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py <<< 30529 1726882605.79182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py" <<< 30529 1726882605.79403: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpczzw7hz0" to remote "/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py" <<< 30529 1726882605.80801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882605.80874: stderr chunk (state=3): >>><<< 30529 1726882605.80883: stdout chunk (state=3): >>><<< 30529 1726882605.80951: done transferring module to remote 30529 1726882605.80971: _low_level_execute_command(): starting 30529 1726882605.80980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/ /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py && sleep 0' 30529 1726882605.81606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882605.81622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882605.81679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882605.81785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882605.81809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882605.81826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.81908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882605.83623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882605.83631: stdout chunk (state=3): >>><<< 30529 1726882605.83640: stderr chunk (state=3): >>><<< 30529 1726882605.83874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882605.83878: _low_level_execute_command(): starting 30529 1726882605.83880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/AnsiballZ_package_facts.py && sleep 0' 30529 1726882605.85189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882605.85498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882605.85502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882605.85618: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30529 1726882606.28977: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882606.29051: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30529 1726882606.29124: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": 
"3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": 
"14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30529 1726882606.29179: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": 
"1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": 
[{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30529 1726882606.29214: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882606.30952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882606.30961: stdout chunk (state=3): >>><<< 30529 1726882606.30975: stderr chunk (state=3): >>><<< 30529 1726882606.31028: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882606.33411: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882606.33491: _low_level_execute_command(): starting 30529 1726882606.33496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882605.7188256-31439-279247721923133/ > /dev/null 2>&1 && sleep 0' 30529 1726882606.34089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882606.34107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882606.34123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882606.34154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882606.34255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882606.34306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882606.34354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882606.36167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882606.36197: stderr chunk (state=3): >>><<< 30529 1726882606.36200: stdout chunk (state=3): >>><<< 30529 1726882606.36213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882606.36219: handler 
run complete 30529 1726882606.36689: variable 'ansible_facts' from source: unknown 30529 1726882606.37005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.38557: variable 'ansible_facts' from source: unknown 30529 1726882606.38813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.39186: attempt loop complete, returning result 30529 1726882606.39203: _execute() done 30529 1726882606.39206: dumping result to json 30529 1726882606.39319: done dumping result, returning 30529 1726882606.39326: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000000792] 30529 1726882606.39330: sending task result for task 12673a56-9f93-b0f1-edc0-000000000792 30529 1726882606.40559: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000792 30529 1726882606.40562: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882606.40641: no more pending results, returning what we have 30529 1726882606.40643: results queue empty 30529 1726882606.40644: checking for any_errors_fatal 30529 1726882606.40647: done checking for any_errors_fatal 30529 1726882606.40647: checking for max_fail_percentage 30529 1726882606.40649: done checking for max_fail_percentage 30529 1726882606.40649: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.40650: done checking to see if all hosts have failed 30529 1726882606.40650: getting the remaining hosts for this loop 30529 1726882606.40651: done getting the remaining hosts for this loop 30529 1726882606.40654: getting the next task for host managed_node1 30529 1726882606.40659: done getting next task for host managed_node1 30529 
1726882606.40662: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882606.40664: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882606.40671: getting variables 30529 1726882606.40672: in VariableManager get_vars() 30529 1726882606.40697: Calling all_inventory to load vars for managed_node1 30529 1726882606.40699: Calling groups_inventory to load vars for managed_node1 30529 1726882606.40700: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.40706: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.40708: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.40710: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.41372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.42330: done with get_vars() 30529 1726882606.42350: done getting variables 30529 1726882606.42420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:46 -0400 (0:00:00.766) 0:00:20.450 ****** 30529 1726882606.42458: entering _queue_task() for managed_node1/debug 30529 1726882606.42772: worker is 1 (out of 1 available) 30529 1726882606.42787: exiting _queue_task() for managed_node1/debug 30529 1726882606.43001: done queuing things up, now waiting for results queue to drain 30529 1726882606.43004: waiting for pending results... 
30529 1726882606.43131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882606.43237: in run() - task 12673a56-9f93-b0f1-edc0-000000000730 30529 1726882606.43257: variable 'ansible_search_path' from source: unknown 30529 1726882606.43263: variable 'ansible_search_path' from source: unknown 30529 1726882606.43307: calling self._execute() 30529 1726882606.43402: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.43447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.43451: variable 'omit' from source: magic vars 30529 1726882606.43796: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.43809: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.43817: variable 'omit' from source: magic vars 30529 1726882606.43860: variable 'omit' from source: magic vars 30529 1726882606.43932: variable 'network_provider' from source: set_fact 30529 1726882606.43945: variable 'omit' from source: magic vars 30529 1726882606.43976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882606.44006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882606.44023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882606.44036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882606.44048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882606.44070: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882606.44073: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882606.44075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.44148: Set connection var ansible_shell_executable to /bin/sh 30529 1726882606.44153: Set connection var ansible_pipelining to False 30529 1726882606.44155: Set connection var ansible_shell_type to sh 30529 1726882606.44164: Set connection var ansible_timeout to 10 30529 1726882606.44166: Set connection var ansible_connection to ssh 30529 1726882606.44171: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882606.44190: variable 'ansible_shell_executable' from source: unknown 30529 1726882606.44195: variable 'ansible_connection' from source: unknown 30529 1726882606.44198: variable 'ansible_module_compression' from source: unknown 30529 1726882606.44200: variable 'ansible_shell_type' from source: unknown 30529 1726882606.44204: variable 'ansible_shell_executable' from source: unknown 30529 1726882606.44206: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.44208: variable 'ansible_pipelining' from source: unknown 30529 1726882606.44210: variable 'ansible_timeout' from source: unknown 30529 1726882606.44213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.44312: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882606.44323: variable 'omit' from source: magic vars 30529 1726882606.44326: starting attempt loop 30529 1726882606.44328: running the handler 30529 1726882606.44364: handler run complete 30529 1726882606.44375: attempt loop complete, returning result 30529 1726882606.44378: _execute() done 30529 1726882606.44381: dumping result to json 30529 1726882606.44383: done dumping result, returning 
30529 1726882606.44394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000000730] 30529 1726882606.44397: sending task result for task 12673a56-9f93-b0f1-edc0-000000000730 30529 1726882606.44475: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000730 30529 1726882606.44478: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882606.44539: no more pending results, returning what we have 30529 1726882606.44542: results queue empty 30529 1726882606.44543: checking for any_errors_fatal 30529 1726882606.44552: done checking for any_errors_fatal 30529 1726882606.44552: checking for max_fail_percentage 30529 1726882606.44554: done checking for max_fail_percentage 30529 1726882606.44555: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.44556: done checking to see if all hosts have failed 30529 1726882606.44556: getting the remaining hosts for this loop 30529 1726882606.44558: done getting the remaining hosts for this loop 30529 1726882606.44562: getting the next task for host managed_node1 30529 1726882606.44568: done getting next task for host managed_node1 30529 1726882606.44571: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882606.44576: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.44588: getting variables 30529 1726882606.44590: in VariableManager get_vars() 30529 1726882606.44619: Calling all_inventory to load vars for managed_node1 30529 1726882606.44622: Calling groups_inventory to load vars for managed_node1 30529 1726882606.44624: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.44632: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.44634: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.44637: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.45619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.50778: done with get_vars() 30529 1726882606.50801: done getting variables 30529 1726882606.50837: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:46 -0400 (0:00:00.084) 0:00:20.534 ****** 30529 1726882606.50862: entering _queue_task() for managed_node1/fail 30529 1726882606.51116: worker is 1 (out of 1 available) 30529 1726882606.51131: exiting _queue_task() for managed_node1/fail 30529 1726882606.51144: done queuing things up, now waiting for results queue to drain 30529 1726882606.51145: waiting for pending results... 30529 1726882606.51323: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882606.51423: in run() - task 12673a56-9f93-b0f1-edc0-000000000731 30529 1726882606.51436: variable 'ansible_search_path' from source: unknown 30529 1726882606.51440: variable 'ansible_search_path' from source: unknown 30529 1726882606.51466: calling self._execute() 30529 1726882606.51535: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.51539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.51548: variable 'omit' from source: magic vars 30529 1726882606.51814: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.51823: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.51904: variable 'network_state' from source: role '' defaults 30529 1726882606.51914: Evaluated conditional (network_state != {}): False 30529 1726882606.51918: when evaluation is False, skipping this task 30529 1726882606.51921: _execute() done 30529 1726882606.51925: dumping result to json 30529 1726882606.51928: done dumping result, returning 30529 1726882606.51932: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000000731] 30529 1726882606.51942: sending task result for task 12673a56-9f93-b0f1-edc0-000000000731 30529 1726882606.52026: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000731 30529 1726882606.52030: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882606.52089: no more pending results, returning what we have 30529 1726882606.52095: results queue empty 30529 1726882606.52096: checking for any_errors_fatal 30529 1726882606.52103: done checking for any_errors_fatal 30529 1726882606.52104: checking for max_fail_percentage 30529 1726882606.52105: done checking for max_fail_percentage 30529 1726882606.52106: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.52107: done checking to see if all hosts have failed 30529 1726882606.52108: getting the remaining hosts for this loop 30529 1726882606.52109: done getting the remaining hosts for this loop 30529 1726882606.52113: getting the next task for host managed_node1 30529 1726882606.52120: done getting next task for host managed_node1 30529 1726882606.52123: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882606.52127: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.52143: getting variables 30529 1726882606.52145: in VariableManager get_vars() 30529 1726882606.52173: Calling all_inventory to load vars for managed_node1 30529 1726882606.52176: Calling groups_inventory to load vars for managed_node1 30529 1726882606.52178: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.52189: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.52191: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.52195: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.53288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.54573: done with get_vars() 30529 1726882606.54601: done getting variables 30529 1726882606.54677: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:46 -0400 (0:00:00.038) 0:00:20.573 ****** 30529 1726882606.54725: entering _queue_task() for managed_node1/fail 30529 1726882606.55090: worker is 1 (out of 1 available) 30529 1726882606.55108: exiting _queue_task() for managed_node1/fail 30529 1726882606.55124: done queuing things up, now waiting for results queue to drain 30529 1726882606.55127: waiting for pending results... 30529 1726882606.55495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882606.55597: in run() - task 12673a56-9f93-b0f1-edc0-000000000732 30529 1726882606.55613: variable 'ansible_search_path' from source: unknown 30529 1726882606.55617: variable 'ansible_search_path' from source: unknown 30529 1726882606.55659: calling self._execute() 30529 1726882606.55726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.55733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.55772: variable 'omit' from source: magic vars 30529 1726882606.56188: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.56210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.56328: variable 'network_state' from source: role '' defaults 30529 1726882606.56332: Evaluated conditional (network_state != {}): False 30529 1726882606.56335: when evaluation is False, skipping this task 30529 1726882606.56339: _execute() done 30529 1726882606.56342: dumping result to json 30529 1726882606.56344: done dumping result, returning 30529 1726882606.56356: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000000732] 30529 1726882606.56359: sending task result for task 12673a56-9f93-b0f1-edc0-000000000732 30529 1726882606.56460: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000732 30529 1726882606.56463: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882606.56519: no more pending results, returning what we have 30529 1726882606.56523: results queue empty 30529 1726882606.56524: checking for any_errors_fatal 30529 1726882606.56529: done checking for any_errors_fatal 30529 1726882606.56530: checking for max_fail_percentage 30529 1726882606.56532: done checking for max_fail_percentage 30529 1726882606.56533: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.56534: done checking to see if all hosts have failed 30529 1726882606.56535: getting the remaining hosts for this loop 30529 1726882606.56536: done getting the remaining hosts for this loop 30529 1726882606.56540: getting the next task for host managed_node1 30529 1726882606.56547: done getting next task for host managed_node1 30529 1726882606.56550: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882606.56554: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.56569: getting variables 30529 1726882606.56571: in VariableManager get_vars() 30529 1726882606.56605: Calling all_inventory to load vars for managed_node1 30529 1726882606.56607: Calling groups_inventory to load vars for managed_node1 30529 1726882606.56609: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.56623: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.56626: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.56629: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.57627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.58518: done with get_vars() 30529 1726882606.58531: done getting variables 30529 1726882606.58571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:46 -0400 (0:00:00.038) 0:00:20.612 ****** 30529 1726882606.58597: entering _queue_task() for managed_node1/fail 30529 1726882606.58783: worker is 1 (out of 1 available) 30529 1726882606.58797: exiting _queue_task() for managed_node1/fail 30529 1726882606.58809: done queuing things up, now waiting for results queue to drain 30529 1726882606.58811: waiting for pending results... 30529 1726882606.58986: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882606.59078: in run() - task 12673a56-9f93-b0f1-edc0-000000000733 30529 1726882606.59110: variable 'ansible_search_path' from source: unknown 30529 1726882606.59114: variable 'ansible_search_path' from source: unknown 30529 1726882606.59133: calling self._execute() 30529 1726882606.59200: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.59203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.59213: variable 'omit' from source: magic vars 30529 1726882606.59468: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.59479: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.59595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882606.61533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882606.61624: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882606.61665: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882606.61731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882606.61735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882606.61842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.61869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.61897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.61940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.61973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.62033: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.62048: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882606.62130: variable 'ansible_distribution' from source: facts 30529 1726882606.62134: variable '__network_rh_distros' from source: role '' defaults 30529 1726882606.62142: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882606.62414: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.62433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.62450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.62475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.62487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.62545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.62568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.62616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.62638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882606.62660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.62687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.62707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.62737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.62764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.62775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.63043: variable 'network_connections' from source: include params 30529 1726882606.63051: variable 'interface' from source: play vars 30529 1726882606.63099: variable 'interface' from source: play vars 30529 1726882606.63123: variable 'network_state' from source: role '' defaults 30529 1726882606.63163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882606.63280: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882606.63310: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882606.63333: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882606.63356: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882606.63387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882606.63403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882606.63425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.63442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882606.63471: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882606.63474: when evaluation is False, skipping this task 30529 1726882606.63477: _execute() done 30529 1726882606.63479: dumping result to json 30529 1726882606.63481: done dumping result, returning 30529 1726882606.63490: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000000733] 30529 1726882606.63492: sending task result for task 
12673a56-9f93-b0f1-edc0-000000000733 30529 1726882606.63570: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000733 30529 1726882606.63572: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882606.63618: no more pending results, returning what we have 30529 1726882606.63621: results queue empty 30529 1726882606.63622: checking for any_errors_fatal 30529 1726882606.63632: done checking for any_errors_fatal 30529 1726882606.63633: checking for max_fail_percentage 30529 1726882606.63635: done checking for max_fail_percentage 30529 1726882606.63636: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.63636: done checking to see if all hosts have failed 30529 1726882606.63637: getting the remaining hosts for this loop 30529 1726882606.63639: done getting the remaining hosts for this loop 30529 1726882606.63642: getting the next task for host managed_node1 30529 1726882606.63648: done getting next task for host managed_node1 30529 1726882606.63652: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882606.63656: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
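The `false_condition` printed above is the role's team-interface detection: two `selectattr` chains that keep only items whose `type` is defined and matches `^team$`. A minimal stdlib Python sketch of the equivalent check (this reimplements the Jinja2 filter chain with `re`; the function name and sample data are illustrative, not taken from the role):

```python
import re

def has_team_connections(network_connections, network_state):
    """Rough stdlib equivalent of the role's Jinja2 condition:
    selectattr("type", "defined") | selectattr("type", "match", "^team$")
    applied to both network_connections and network_state interfaces."""
    def team_items(items):
        # keep items where 'type' is defined and matches ^team$
        return [i for i in items
                if "type" in i and re.match(r"^team$", i["type"])]
    return (len(team_items(network_connections)) > 0
            or len(team_items(network_state.get("interfaces", []))) > 0)
```

With only non-team connection types defined (as in this run), the condition evaluates False and the abort task is skipped, matching the `skip_reason` in the result above.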
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.63670: getting variables 30529 1726882606.63672: in VariableManager get_vars() 30529 1726882606.63704: Calling all_inventory to load vars for managed_node1 30529 1726882606.63707: Calling groups_inventory to load vars for managed_node1 30529 1726882606.63709: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.63717: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.63720: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.63722: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.64626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.65550: done with get_vars() 30529 1726882606.65566: done getting variables 30529 1726882606.65608: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:46 -0400 (0:00:00.070) 0:00:20.682 ****** 30529 1726882606.65631: entering _queue_task() for managed_node1/dnf 30529 1726882606.65859: worker is 1 (out of 1 available) 30529 1726882606.65871: exiting _queue_task() for managed_node1/dnf 30529 1726882606.65883: done queuing things up, now waiting for results queue to drain 30529 1726882606.65885: waiting for pending results... 30529 1726882606.66188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882606.66257: in run() - task 12673a56-9f93-b0f1-edc0-000000000734 30529 1726882606.66269: variable 'ansible_search_path' from source: unknown 30529 1726882606.66271: variable 'ansible_search_path' from source: unknown 30529 1726882606.66305: calling self._execute() 30529 1726882606.66371: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.66375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.66385: variable 'omit' from source: magic vars 30529 1726882606.66671: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.66708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.66871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882606.69156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882606.69211: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882606.69343: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882606.69347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882606.69400: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882606.69461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.69505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.69532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.69577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.69649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.69800: variable 'ansible_distribution' from source: facts 30529 1726882606.69806: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.69809: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882606.69904: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882606.70182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.70189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.70192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.70196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.70199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.70227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.70237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.70253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.70320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.70323: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.70348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.70372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.70461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.70475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.70479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.70663: variable 'network_connections' from source: include params 30529 1726882606.70667: variable 'interface' from source: play vars 30529 1726882606.70698: variable 'interface' from source: play vars 30529 1726882606.70761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882606.70911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882606.70958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882606.70999: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882606.71019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882606.71049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882606.71064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882606.71098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.71110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882606.71153: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882606.71305: variable 'network_connections' from source: include params 30529 1726882606.71309: variable 'interface' from source: play vars 30529 1726882606.71349: variable 'interface' from source: play vars 30529 1726882606.71372: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882606.71375: when evaluation is False, skipping this task 30529 1726882606.71378: _execute() done 30529 1726882606.71380: dumping result to json 30529 1726882606.71382: done dumping result, returning 30529 1726882606.71394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000734] 30529 
1726882606.71397: sending task result for task 12673a56-9f93-b0f1-edc0-000000000734 30529 1726882606.71487: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000734 30529 1726882606.71490: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882606.71561: no more pending results, returning what we have 30529 1726882606.71564: results queue empty 30529 1726882606.71565: checking for any_errors_fatal 30529 1726882606.71571: done checking for any_errors_fatal 30529 1726882606.71572: checking for max_fail_percentage 30529 1726882606.71573: done checking for max_fail_percentage 30529 1726882606.71574: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.71576: done checking to see if all hosts have failed 30529 1726882606.71576: getting the remaining hosts for this loop 30529 1726882606.71578: done getting the remaining hosts for this loop 30529 1726882606.71581: getting the next task for host managed_node1 30529 1726882606.71590: done getting next task for host managed_node1 30529 1726882606.71595: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882606.71600: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.71616: getting variables 30529 1726882606.71618: in VariableManager get_vars() 30529 1726882606.71654: Calling all_inventory to load vars for managed_node1 30529 1726882606.71657: Calling groups_inventory to load vars for managed_node1 30529 1726882606.71659: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.71668: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.71671: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.71674: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.72894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.73983: done with get_vars() 30529 1726882606.74000: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882606.74067: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:46 -0400 (0:00:00.084) 0:00:20.767 ****** 30529 1726882606.74092: entering _queue_task() for managed_node1/yum 30529 1726882606.74342: worker is 1 (out of 1 available) 30529 1726882606.74357: exiting _queue_task() for managed_node1/yum 30529 1726882606.74370: done queuing things up, now waiting for results queue to drain 30529 1726882606.74371: waiting for pending results... 30529 1726882606.74658: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882606.74746: in run() - task 12673a56-9f93-b0f1-edc0-000000000735 30529 1726882606.74757: variable 'ansible_search_path' from source: unknown 30529 1726882606.74760: variable 'ansible_search_path' from source: unknown 30529 1726882606.74790: calling self._execute() 30529 1726882606.74875: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.74879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.74887: variable 'omit' from source: magic vars 30529 1726882606.75195: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.75231: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.75400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882606.76970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882606.77070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882606.77078: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882606.77106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882606.77129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882606.77195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.77216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.77234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.77281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.77295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.77363: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.77389: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882606.77392: when evaluation is False, skipping this task 30529 1726882606.77397: _execute() done 30529 1726882606.77399: dumping result to json 30529 1726882606.77401: done dumping result, returning 30529 1726882606.77404: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000735] 30529 1726882606.77407: sending task result for task 12673a56-9f93-b0f1-edc0-000000000735 30529 1726882606.77514: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000735 30529 1726882606.77516: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882606.77573: no more pending results, returning what we have 30529 1726882606.77576: results queue empty 30529 1726882606.77577: checking for any_errors_fatal 30529 1726882606.77582: done checking for any_errors_fatal 30529 1726882606.77583: checking for max_fail_percentage 30529 1726882606.77585: done checking for max_fail_percentage 30529 1726882606.77588: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.77589: done checking to see if all hosts have failed 30529 1726882606.77590: getting the remaining hosts for this loop 30529 1726882606.77591: done getting the remaining hosts for this loop 30529 1726882606.77596: getting the next task for host managed_node1 30529 1726882606.77603: done getting next task for host managed_node1 30529 1726882606.77607: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882606.77614: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
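The two package-manager tasks above are gated on the distribution major version: the DNF task's conditional is `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, while the YUM task only runs when `ansible_distribution_major_version | int < 8` (False here, hence the skip). A small illustrative sketch of that gating, simplified to the version comparison alone (function name is an assumption, not from the role):

```python
def updates_check_path(ansible_distribution, ansible_distribution_major_version):
    """Mirror the log's two conditionals: Fedora or EL > 7 takes the DNF
    task; only EL < 8 takes the YUM task."""
    major = int(ansible_distribution_major_version)
    if ansible_distribution == "Fedora" or major > 7:
        return "dnf"
    return "yum"
```

On this EL host the version is >= 8, so the DNF branch is selected and the YUM task is skipped with `false_condition: ansible_distribution_major_version | int < 8`, as shown above.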
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.77629: getting variables 30529 1726882606.77631: in VariableManager get_vars() 30529 1726882606.77659: Calling all_inventory to load vars for managed_node1 30529 1726882606.77661: Calling groups_inventory to load vars for managed_node1 30529 1726882606.77663: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.77671: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.77674: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.77677: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.78661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.79876: done with get_vars() 30529 1726882606.79902: done getting variables 30529 1726882606.79970: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:46 -0400 (0:00:00.059) 0:00:20.826 ****** 30529 1726882606.80010: entering _queue_task() for managed_node1/fail 30529 1726882606.80270: worker is 1 (out of 1 available) 30529 1726882606.80283: exiting _queue_task() for managed_node1/fail 30529 1726882606.80302: done queuing things up, now waiting for results queue to drain 30529 1726882606.80306: waiting for pending results... 30529 1726882606.80552: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882606.80657: in run() - task 12673a56-9f93-b0f1-edc0-000000000736 30529 1726882606.80700: variable 'ansible_search_path' from source: unknown 30529 1726882606.80704: variable 'ansible_search_path' from source: unknown 30529 1726882606.80759: calling self._execute() 30529 1726882606.80809: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.80814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.80823: variable 'omit' from source: magic vars 30529 1726882606.81099: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.81154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.81260: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882606.81433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882606.83024: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882606.83073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882606.83105: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882606.83131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882606.83150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882606.83213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.83235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.83252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.83279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.83294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.83398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.83402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.83404: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.83406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.83408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.83417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.83438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.83454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.83478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.83489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.83609: variable 'network_connections' from source: include params 30529 1726882606.83619: variable 'interface' from source: play vars 30529 1726882606.83666: variable 'interface' from source: play vars 30529 1726882606.83717: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882606.83826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882606.83866: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882606.83886: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882606.83911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882606.83940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882606.83955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882606.83974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.83998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882606.84042: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882606.84190: variable 'network_connections' from source: include params 30529 1726882606.84201: variable 'interface' from source: play vars 30529 1726882606.84243: variable 'interface' from source: play vars 30529 1726882606.84266: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882606.84270: when evaluation is False, skipping this task 30529 
1726882606.84272: _execute() done 30529 1726882606.84275: dumping result to json 30529 1726882606.84277: done dumping result, returning 30529 1726882606.84283: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000736] 30529 1726882606.84290: sending task result for task 12673a56-9f93-b0f1-edc0-000000000736 30529 1726882606.84374: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000736 30529 1726882606.84377: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882606.84437: no more pending results, returning what we have 30529 1726882606.84440: results queue empty 30529 1726882606.84441: checking for any_errors_fatal 30529 1726882606.84447: done checking for any_errors_fatal 30529 1726882606.84447: checking for max_fail_percentage 30529 1726882606.84449: done checking for max_fail_percentage 30529 1726882606.84450: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.84451: done checking to see if all hosts have failed 30529 1726882606.84451: getting the remaining hosts for this loop 30529 1726882606.84453: done getting the remaining hosts for this loop 30529 1726882606.84456: getting the next task for host managed_node1 30529 1726882606.84465: done getting next task for host managed_node1 30529 1726882606.84468: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882606.84472: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882606.84490: getting variables 30529 1726882606.84491: in VariableManager get_vars() 30529 1726882606.84533: Calling all_inventory to load vars for managed_node1 30529 1726882606.84536: Calling groups_inventory to load vars for managed_node1 30529 1726882606.84538: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.84546: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.84549: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.84551: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.85309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.86170: done with get_vars() 30529 1726882606.86185: done getting variables 30529 1726882606.86228: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:46 -0400 (0:00:00.062) 0:00:20.888 ****** 30529 1726882606.86254: entering _queue_task() for managed_node1/package 30529 1726882606.86475: worker is 1 (out of 1 available) 30529 1726882606.86487: exiting _queue_task() for managed_node1/package 30529 1726882606.86500: done queuing things up, now waiting for results queue to drain 30529 1726882606.86502: waiting for pending results... 30529 1726882606.86680: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882606.86771: in run() - task 12673a56-9f93-b0f1-edc0-000000000737 30529 1726882606.86782: variable 'ansible_search_path' from source: unknown 30529 1726882606.86785: variable 'ansible_search_path' from source: unknown 30529 1726882606.86817: calling self._execute() 30529 1726882606.86887: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.86895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.86904: variable 'omit' from source: magic vars 30529 1726882606.87168: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.87182: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.87315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882606.87501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882606.87532: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882606.87557: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882606.87611: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882606.87680: variable 'network_packages' from source: role '' defaults 30529 1726882606.87756: variable '__network_provider_setup' from source: role '' defaults 30529 1726882606.87764: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882606.87811: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882606.87819: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882606.87862: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882606.87977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882606.89252: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882606.89296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882606.89324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882606.89348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882606.89369: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882606.89656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.89679: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.89699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.89725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.89736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.89767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.89789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.89807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.89832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.89842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882606.89974: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882606.90047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.90064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.90080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.90110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.90122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.90179: variable 'ansible_python' from source: facts 30529 1726882606.90192: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882606.90250: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882606.90304: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882606.90384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.90404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.90423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.90451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.90461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.90492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882606.90513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882606.90531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.90558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882606.90568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882606.90662: variable 'network_connections' from source: include params 
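Several evaluations in this stretch of the log hinge on whether any of the requested `network_connections` are wireless or team interfaces: the role resolves `__network_wireless_connections_defined` and `__network_team_connections_defined` from its defaults before deciding whether the NetworkManager-restart consent task can run. A minimal Python sketch of that kind of test follows; the connection shape and field names are illustrative assumptions, not the role's actual implementation.

```python
# Hypothetical sketch of the wireless/team detection behind the
# "Ask user's consent to restart NetworkManager" conditional.
# Connection structure and field names are assumptions for illustration.
def any_of_type(network_connections, wanted):
    # True if any requested connection has the given interface type.
    return any(c.get("type") == wanted for c in network_connections)

connections = [{"name": "statebr", "type": "bridge"}]

wireless_defined = any_of_type(connections, "wireless")
team_defined = any_of_type(connections, "team")

# Mirrors the logged conditional:
#   __network_wireless_connections_defined or __network_team_connections_defined
if not (wireless_defined or team_defined):
    result = {"changed": False, "skip_reason": "Conditional result was False"}
```

With only a bridge connection requested, both flags evaluate false and the task is skipped, matching the `skipping: [managed_node1]` result in the log.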
30529 1726882606.90666: variable 'interface' from source: play vars 30529 1726882606.90736: variable 'interface' from source: play vars 30529 1726882606.90789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882606.90809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882606.90831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882606.90852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882606.90890: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882606.91062: variable 'network_connections' from source: include params 30529 1726882606.91065: variable 'interface' from source: play vars 30529 1726882606.91139: variable 'interface' from source: play vars 30529 1726882606.91174: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882606.91232: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882606.91421: variable 'network_connections' from source: include params 30529 1726882606.91425: variable 'interface' from source: play vars 30529 1726882606.91470: variable 'interface' from source: play vars 30529 1726882606.91490: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882606.91545: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882606.91735: variable 'network_connections' 
from source: include params 30529 1726882606.91739: variable 'interface' from source: play vars 30529 1726882606.91783: variable 'interface' from source: play vars 30529 1726882606.91826: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882606.91870: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882606.91875: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882606.91919: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882606.92053: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882606.92343: variable 'network_connections' from source: include params 30529 1726882606.92346: variable 'interface' from source: play vars 30529 1726882606.92391: variable 'interface' from source: play vars 30529 1726882606.92399: variable 'ansible_distribution' from source: facts 30529 1726882606.92402: variable '__network_rh_distros' from source: role '' defaults 30529 1726882606.92408: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.92430: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882606.92535: variable 'ansible_distribution' from source: facts 30529 1726882606.92538: variable '__network_rh_distros' from source: role '' defaults 30529 1726882606.92541: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.92550: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882606.92655: variable 'ansible_distribution' from source: facts 30529 1726882606.92659: variable '__network_rh_distros' from source: role '' defaults 30529 1726882606.92661: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.92688: variable 'network_provider' from source: set_fact 30529 
1726882606.92702: variable 'ansible_facts' from source: unknown 30529 1726882606.93047: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882606.93050: when evaluation is False, skipping this task 30529 1726882606.93053: _execute() done 30529 1726882606.93055: dumping result to json 30529 1726882606.93057: done dumping result, returning 30529 1726882606.93065: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000000737] 30529 1726882606.93069: sending task result for task 12673a56-9f93-b0f1-edc0-000000000737 30529 1726882606.93196: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000737 30529 1726882606.93199: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882606.93280: no more pending results, returning what we have 30529 1726882606.93283: results queue empty 30529 1726882606.93284: checking for any_errors_fatal 30529 1726882606.93291: done checking for any_errors_fatal 30529 1726882606.93292: checking for max_fail_percentage 30529 1726882606.93296: done checking for max_fail_percentage 30529 1726882606.93297: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.93298: done checking to see if all hosts have failed 30529 1726882606.93299: getting the remaining hosts for this loop 30529 1726882606.93300: done getting the remaining hosts for this loop 30529 1726882606.93304: getting the next task for host managed_node1 30529 1726882606.93311: done getting next task for host managed_node1 30529 1726882606.93315: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882606.93319: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882606.93334: getting variables 30529 1726882606.93335: in VariableManager get_vars() 30529 1726882606.93368: Calling all_inventory to load vars for managed_node1 30529 1726882606.93370: Calling groups_inventory to load vars for managed_node1 30529 1726882606.93372: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.93381: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.93383: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.93388: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.94295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.95218: done with get_vars() 30529 1726882606.95246: done getting variables 30529 1726882606.95328: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:46 -0400 (0:00:00.090) 0:00:20.979 ****** 30529 1726882606.95354: entering _queue_task() for managed_node1/package 30529 1726882606.95599: worker is 1 (out of 1 available) 30529 1726882606.95613: exiting _queue_task() for managed_node1/package 30529 1726882606.95626: done queuing things up, now waiting for results queue to drain 30529 1726882606.95627: waiting for pending results... 
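The "Install packages" skip above is driven by the logged condition `not network_packages is subset(ansible_facts.packages.keys())`: the task runs only when at least one required package is missing from the gathered package facts. A small Python sketch of the same subset test, with illustrative package lists standing in for the real facts:

```python
# Sketch of the "Install packages" gate. The task is skipped when every
# entry in network_packages is already present in the installed-package
# facts. Package names and versions here are illustrative assumptions.
network_packages = ["NetworkManager"]
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.46"}],
    "openssh": [{"version": "9.6"}],
}

# Jinja2's `subset` test, expressed with Python sets:
#   not network_packages is subset(ansible_facts.packages.keys())
needs_install = not set(network_packages).issubset(ansible_facts_packages.keys())
```

Here `needs_install` is `False`, so the condition evaluates false and the task is skipped without calling the package manager, which is exactly the `false_condition` shown in the log.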
30529 1726882606.95828: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882606.95924: in run() - task 12673a56-9f93-b0f1-edc0-000000000738 30529 1726882606.95938: variable 'ansible_search_path' from source: unknown 30529 1726882606.95943: variable 'ansible_search_path' from source: unknown 30529 1726882606.95969: calling self._execute() 30529 1726882606.96040: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.96045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.96053: variable 'omit' from source: magic vars 30529 1726882606.96316: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.96326: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.96409: variable 'network_state' from source: role '' defaults 30529 1726882606.96418: Evaluated conditional (network_state != {}): False 30529 1726882606.96421: when evaluation is False, skipping this task 30529 1726882606.96424: _execute() done 30529 1726882606.96426: dumping result to json 30529 1726882606.96429: done dumping result, returning 30529 1726882606.96437: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000738] 30529 1726882606.96440: sending task result for task 12673a56-9f93-b0f1-edc0-000000000738 30529 1726882606.96529: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000738 30529 1726882606.96532: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882606.96583: no more pending results, returning what we have 30529 1726882606.96589: results queue empty 30529 1726882606.96590: checking 
for any_errors_fatal 30529 1726882606.96598: done checking for any_errors_fatal 30529 1726882606.96598: checking for max_fail_percentage 30529 1726882606.96600: done checking for max_fail_percentage 30529 1726882606.96601: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.96601: done checking to see if all hosts have failed 30529 1726882606.96602: getting the remaining hosts for this loop 30529 1726882606.96604: done getting the remaining hosts for this loop 30529 1726882606.96607: getting the next task for host managed_node1 30529 1726882606.96614: done getting next task for host managed_node1 30529 1726882606.96617: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882606.96622: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882606.96638: getting variables 30529 1726882606.96639: in VariableManager get_vars() 30529 1726882606.96671: Calling all_inventory to load vars for managed_node1 30529 1726882606.96674: Calling groups_inventory to load vars for managed_node1 30529 1726882606.96676: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.96684: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.96689: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.96692: Calling groups_plugins_play to load vars for managed_node1 30529 1726882606.97441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882606.98319: done with get_vars() 30529 1726882606.98333: done getting variables 30529 1726882606.98374: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:46 -0400 (0:00:00.030) 0:00:21.010 ****** 30529 1726882606.98402: entering _queue_task() for managed_node1/package 30529 1726882606.98634: worker is 1 (out of 1 available) 30529 1726882606.98646: exiting _queue_task() for managed_node1/package 30529 1726882606.98659: done queuing things up, now waiting for results queue to drain 30529 1726882606.98661: waiting for pending results... 
30529 1726882606.98839: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882606.98929: in run() - task 12673a56-9f93-b0f1-edc0-000000000739 30529 1726882606.98940: variable 'ansible_search_path' from source: unknown 30529 1726882606.98943: variable 'ansible_search_path' from source: unknown 30529 1726882606.98969: calling self._execute() 30529 1726882606.99038: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882606.99042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882606.99051: variable 'omit' from source: magic vars 30529 1726882606.99312: variable 'ansible_distribution_major_version' from source: facts 30529 1726882606.99327: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882606.99406: variable 'network_state' from source: role '' defaults 30529 1726882606.99416: Evaluated conditional (network_state != {}): False 30529 1726882606.99420: when evaluation is False, skipping this task 30529 1726882606.99422: _execute() done 30529 1726882606.99425: dumping result to json 30529 1726882606.99430: done dumping result, returning 30529 1726882606.99440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000739] 30529 1726882606.99443: sending task result for task 12673a56-9f93-b0f1-edc0-000000000739 30529 1726882606.99532: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000739 30529 1726882606.99535: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882606.99583: no more pending results, returning what we have 30529 1726882606.99590: results queue empty 30529 1726882606.99591: checking for 
any_errors_fatal 30529 1726882606.99604: done checking for any_errors_fatal 30529 1726882606.99605: checking for max_fail_percentage 30529 1726882606.99607: done checking for max_fail_percentage 30529 1726882606.99608: checking to see if all hosts have failed and the running result is not ok 30529 1726882606.99608: done checking to see if all hosts have failed 30529 1726882606.99609: getting the remaining hosts for this loop 30529 1726882606.99611: done getting the remaining hosts for this loop 30529 1726882606.99615: getting the next task for host managed_node1 30529 1726882606.99622: done getting next task for host managed_node1 30529 1726882606.99625: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882606.99630: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882606.99647: getting variables 30529 1726882606.99648: in VariableManager get_vars() 30529 1726882606.99675: Calling all_inventory to load vars for managed_node1 30529 1726882606.99677: Calling groups_inventory to load vars for managed_node1 30529 1726882606.99679: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882606.99690: Calling all_plugins_play to load vars for managed_node1 30529 1726882606.99694: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882606.99698: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.00557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.01421: done with get_vars() 30529 1726882607.01435: done getting variables 30529 1726882607.01477: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:47 -0400 (0:00:00.031) 0:00:21.041 ****** 30529 1726882607.01505: entering _queue_task() for managed_node1/service 30529 1726882607.01723: worker is 1 (out of 1 available) 30529 1726882607.01735: exiting _queue_task() for managed_node1/service 30529 1726882607.01749: done queuing things up, now waiting for results queue to drain 30529 1726882607.01750: waiting for pending results... 
30529 1726882607.01924: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882607.02010: in run() - task 12673a56-9f93-b0f1-edc0-00000000073a 30529 1726882607.02023: variable 'ansible_search_path' from source: unknown 30529 1726882607.02026: variable 'ansible_search_path' from source: unknown 30529 1726882607.02052: calling self._execute() 30529 1726882607.02124: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.02128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.02136: variable 'omit' from source: magic vars 30529 1726882607.02397: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.02408: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.02490: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882607.02617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882607.04065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882607.04118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882607.04148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882607.04171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882607.04192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882607.04252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882607.04275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.04297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.04323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.04333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.04369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.04385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.04404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.04429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.04440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.04467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.04484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.04509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.04532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.04542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.04662: variable 'network_connections' from source: include params 30529 1726882607.04671: variable 'interface' from source: play vars 30529 1726882607.04724: variable 'interface' from source: play vars 30529 1726882607.04769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882607.04878: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882607.04919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882607.04942: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882607.04964: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882607.04997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882607.05015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882607.05035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.05052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882607.05098: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882607.05253: variable 'network_connections' from source: include params 30529 1726882607.05256: variable 'interface' from source: play vars 30529 1726882607.05301: variable 'interface' from source: play vars 30529 1726882607.05326: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882607.05329: when evaluation is False, skipping this task 30529 1726882607.05332: _execute() done 30529 1726882607.05336: dumping result to json 30529 1726882607.05339: done dumping result, returning 30529 1726882607.05346: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000073a] 30529 1726882607.05348: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073a 30529 1726882607.05431: done sending task result for task 
12673a56-9f93-b0f1-edc0-00000000073a 30529 1726882607.05441: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882607.05490: no more pending results, returning what we have 30529 1726882607.05494: results queue empty 30529 1726882607.05495: checking for any_errors_fatal 30529 1726882607.05503: done checking for any_errors_fatal 30529 1726882607.05504: checking for max_fail_percentage 30529 1726882607.05505: done checking for max_fail_percentage 30529 1726882607.05506: checking to see if all hosts have failed and the running result is not ok 30529 1726882607.05507: done checking to see if all hosts have failed 30529 1726882607.05508: getting the remaining hosts for this loop 30529 1726882607.05509: done getting the remaining hosts for this loop 30529 1726882607.05513: getting the next task for host managed_node1 30529 1726882607.05521: done getting next task for host managed_node1 30529 1726882607.05525: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882607.05530: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882607.05547: getting variables 30529 1726882607.05548: in VariableManager get_vars() 30529 1726882607.05580: Calling all_inventory to load vars for managed_node1 30529 1726882607.05582: Calling groups_inventory to load vars for managed_node1 30529 1726882607.05584: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882607.05595: Calling all_plugins_play to load vars for managed_node1 30529 1726882607.05598: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882607.05600: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.06370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.07228: done with get_vars() 30529 1726882607.07243: done getting variables 30529 1726882607.07286: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:47 -0400 (0:00:00.058) 0:00:21.099 ****** 30529 1726882607.07311: entering _queue_task() for managed_node1/service 30529 1726882607.07530: worker is 1 (out of 1 available) 30529 1726882607.07542: exiting _queue_task() for managed_node1/service 30529 1726882607.07555: done 
queuing things up, now waiting for results queue to drain 30529 1726882607.07557: waiting for pending results... 30529 1726882607.07734: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882607.07822: in run() - task 12673a56-9f93-b0f1-edc0-00000000073b 30529 1726882607.07834: variable 'ansible_search_path' from source: unknown 30529 1726882607.07838: variable 'ansible_search_path' from source: unknown 30529 1726882607.07863: calling self._execute() 30529 1726882607.07933: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.07938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.07947: variable 'omit' from source: magic vars 30529 1726882607.08205: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.08222: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.08328: variable 'network_provider' from source: set_fact 30529 1726882607.08332: variable 'network_state' from source: role '' defaults 30529 1726882607.08336: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882607.08344: variable 'omit' from source: magic vars 30529 1726882607.08382: variable 'omit' from source: magic vars 30529 1726882607.08405: variable 'network_service_name' from source: role '' defaults 30529 1726882607.08454: variable 'network_service_name' from source: role '' defaults 30529 1726882607.08526: variable '__network_provider_setup' from source: role '' defaults 30529 1726882607.08530: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882607.08577: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882607.08584: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882607.08632: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882607.08777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882607.10385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882607.10439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882607.10465: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882607.10492: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882607.10517: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882607.10570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.10595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.10617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.10642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.10653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.10683: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.10704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.10723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.10748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.10758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.10902: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882607.10974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.10995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.11013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.11036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.11048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.11114: variable 'ansible_python' from source: facts 30529 1726882607.11125: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882607.11180: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882607.11236: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882607.11321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.11337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.11353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.11381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.11395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.11427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.11446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.11462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.11492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.11505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.11595: variable 'network_connections' from source: include params 30529 1726882607.11602: variable 'interface' from source: play vars 30529 1726882607.11653: variable 'interface' from source: play vars 30529 1726882607.11728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882607.11853: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882607.11886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882607.11923: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882607.11952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882607.11995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882607.12017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882607.12043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.12065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882607.12103: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882607.12272: variable 'network_connections' from source: include params 30529 1726882607.12278: variable 'interface' from source: play vars 30529 1726882607.12333: variable 'interface' from source: play vars 30529 1726882607.12367: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882607.12424: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882607.12608: variable 'network_connections' from source: include params 30529 1726882607.12611: variable 'interface' from source: play vars 30529 1726882607.12659: variable 'interface' from source: play vars 30529 1726882607.12682: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882607.12734: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882607.12915: variable 'network_connections' from source: include params 30529 1726882607.12918: variable 'interface' from source: play vars 30529 1726882607.12966: variable 'interface' from source: play vars 30529 1726882607.13014: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882607.13054: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882607.13059: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882607.13104: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882607.13238: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882607.13533: variable 'network_connections' from source: include params 30529 1726882607.13536: variable 'interface' from source: play vars 30529 1726882607.13580: variable 'interface' from source: play vars 30529 1726882607.13588: variable 'ansible_distribution' from source: facts 30529 1726882607.13595: variable '__network_rh_distros' from source: role '' defaults 30529 1726882607.13601: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.13624: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882607.13734: variable 'ansible_distribution' from source: facts 30529 1726882607.13737: variable '__network_rh_distros' from source: role '' defaults 30529 1726882607.13742: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.13749: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882607.13859: variable 'ansible_distribution' from source: facts 30529 1726882607.13862: variable '__network_rh_distros' from source: role '' defaults 30529 1726882607.13865: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.13898: variable 'network_provider' from source: set_fact 30529 1726882607.13913: variable 'omit' from source: magic vars 30529 1726882607.13932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882607.13951: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882607.13965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882607.13977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882607.13992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882607.14013: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882607.14016: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.14019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.14083: Set connection var ansible_shell_executable to /bin/sh 30529 1726882607.14086: Set connection var ansible_pipelining to False 30529 1726882607.14097: Set connection var ansible_shell_type to sh 30529 1726882607.14105: Set connection var ansible_timeout to 10 30529 1726882607.14108: Set connection var ansible_connection to ssh 30529 1726882607.14110: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882607.14128: variable 'ansible_shell_executable' from source: unknown 30529 1726882607.14131: variable 'ansible_connection' from source: unknown 30529 1726882607.14133: variable 'ansible_module_compression' from source: unknown 30529 1726882607.14135: variable 'ansible_shell_type' from source: unknown 30529 1726882607.14138: variable 'ansible_shell_executable' from source: unknown 30529 1726882607.14140: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.14144: variable 'ansible_pipelining' from source: unknown 30529 1726882607.14146: variable 'ansible_timeout' from source: unknown 30529 1726882607.14150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882607.14223: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882607.14231: variable 'omit' from source: magic vars 30529 1726882607.14237: starting attempt loop 30529 1726882607.14239: running the handler 30529 1726882607.14295: variable 'ansible_facts' from source: unknown 30529 1726882607.14673: _low_level_execute_command(): starting 30529 1726882607.14679: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882607.15167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.15171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.15173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.15176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.15233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882607.15236: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.15238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.15302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.16982: stdout chunk (state=3): >>>/root <<< 30529 1726882607.17084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.17117: stderr chunk (state=3): >>><<< 30529 1726882607.17121: stdout chunk (state=3): >>><<< 30529 1726882607.17138: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.17147: _low_level_execute_command(): starting 30529 1726882607.17152: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352 `" && echo ansible-tmp-1726882607.1713712-31487-231128002920352="` echo /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352 `" ) && sleep 0' 30529 1726882607.17584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.17590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.17592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882607.17596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882607.17598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.17648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882607.17652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.17657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.17697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.19566: stdout chunk (state=3): 
>>>ansible-tmp-1726882607.1713712-31487-231128002920352=/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352 <<< 30529 1726882607.19669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.19697: stderr chunk (state=3): >>><<< 30529 1726882607.19700: stdout chunk (state=3): >>><<< 30529 1726882607.19714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882607.1713712-31487-231128002920352=/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.19738: variable 'ansible_module_compression' from source: unknown 30529 1726882607.19775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882607.19828: variable 'ansible_facts' 
from source: unknown 30529 1726882607.19961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py 30529 1726882607.20113: Sending initial data 30529 1726882607.20116: Sent initial data (156 bytes) 30529 1726882607.20805: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882607.20817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882607.20820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.20894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.20919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.21036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.22515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882607.22519: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882607.22610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882607.22630: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp54eddrvn /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py <<< 30529 1726882607.22633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py" <<< 30529 1726882607.22662: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp54eddrvn" to remote "/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py" <<< 30529 1726882607.24432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.24436: stderr chunk (state=3): >>><<< 30529 1726882607.24438: stdout chunk (state=3): >>><<< 30529 1726882607.24440: done transferring module to remote 30529 1726882607.24442: _low_level_execute_command(): starting 30529 1726882607.24444: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/ /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py && sleep 0' 30529 1726882607.24972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882607.24985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882607.25002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.25019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882607.25042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882607.25140: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.25176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.25215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.26915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.26939: stderr chunk (state=3): >>><<< 30529 1726882607.26942: stdout chunk (state=3): >>><<< 30529 1726882607.26956: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.26959: _low_level_execute_command(): starting 30529 1726882607.26963: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/AnsiballZ_systemd.py && sleep 0' 30529 1726882607.27524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.27561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882607.27564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.27589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.27664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.56357: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 
2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315687424", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1724850000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", 
"StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882607.58221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882607.58227: stderr chunk (state=3): >>><<< 30529 1726882607.58230: stdout chunk (state=3): >>><<< 30529 1726882607.58268: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10833920", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315687424", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1724850000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882607.58484: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882607.58494: _low_level_execute_command(): starting 30529 1726882607.58538: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882607.1713712-31487-231128002920352/ > /dev/null 2>&1 && sleep 0' 30529 1726882607.59577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.59742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.59769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.59809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.61674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.61682: stdout chunk (state=3): >>><<< 30529 1726882607.61685: stderr chunk (state=3): >>><<< 30529 1726882607.61750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.61753: handler run complete 30529 1726882607.61816: attempt loop complete, returning result 
30529 1726882607.61819: _execute() done 30529 1726882607.61821: dumping result to json 30529 1726882607.61823: done dumping result, returning 30529 1726882607.61825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-00000000073b] 30529 1726882607.61827: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073b 30529 1726882607.62843: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000073b 30529 1726882607.62846: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882607.62966: no more pending results, returning what we have 30529 1726882607.62970: results queue empty 30529 1726882607.62971: checking for any_errors_fatal 30529 1726882607.62973: done checking for any_errors_fatal 30529 1726882607.62974: checking for max_fail_percentage 30529 1726882607.62976: done checking for max_fail_percentage 30529 1726882607.62976: checking to see if all hosts have failed and the running result is not ok 30529 1726882607.62977: done checking to see if all hosts have failed 30529 1726882607.62978: getting the remaining hosts for this loop 30529 1726882607.62979: done getting the remaining hosts for this loop 30529 1726882607.62982: getting the next task for host managed_node1 30529 1726882607.62988: done getting next task for host managed_node1 30529 1726882607.62992: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882607.62999: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882607.63033: getting variables 30529 1726882607.63036: in VariableManager get_vars() 30529 1726882607.63062: Calling all_inventory to load vars for managed_node1 30529 1726882607.63065: Calling groups_inventory to load vars for managed_node1 30529 1726882607.63067: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882607.63076: Calling all_plugins_play to load vars for managed_node1 30529 1726882607.63158: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882607.63215: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.64165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.65537: done with get_vars() 30529 1726882607.65562: done getting variables 30529 1726882607.65639: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:47 -0400 (0:00:00.583) 0:00:21.682 ****** 30529 1726882607.65688: entering _queue_task() for managed_node1/service 30529 1726882607.66074: worker is 1 (out of 1 available) 30529 1726882607.66091: exiting _queue_task() for managed_node1/service 30529 1726882607.66105: done queuing things up, now waiting for results queue to drain 30529 1726882607.66106: waiting for pending results... 30529 1726882607.66516: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882607.66598: in run() - task 12673a56-9f93-b0f1-edc0-00000000073c 30529 1726882607.66624: variable 'ansible_search_path' from source: unknown 30529 1726882607.66636: variable 'ansible_search_path' from source: unknown 30529 1726882607.66682: calling self._execute() 30529 1726882607.66805: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.66822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.66850: variable 'omit' from source: magic vars 30529 1726882607.67379: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.67383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.67478: variable 'network_provider' from source: set_fact 30529 1726882607.67538: Evaluated conditional (network_provider == "nm"): True 30529 1726882607.67603: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882607.67768: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30529 1726882607.67939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882607.70026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882607.70090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882607.70146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882607.70173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882607.70191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882607.70433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.70465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.70485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.70537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.70550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.70610: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.70639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.70647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.70699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.70703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.70729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.70760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.70776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.70815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882607.70826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.70956: variable 'network_connections' from source: include params 30529 1726882607.70959: variable 'interface' from source: play vars 30529 1726882607.71038: variable 'interface' from source: play vars 30529 1726882607.71106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882607.71266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882607.71498: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882607.71501: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882607.71503: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882607.71505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882607.71507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882607.71510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.71512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882607.71558: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30529 1726882607.72044: variable 'network_connections' from source: include params 30529 1726882607.72054: variable 'interface' from source: play vars 30529 1726882607.72120: variable 'interface' from source: play vars 30529 1726882607.72322: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882607.72324: when evaluation is False, skipping this task 30529 1726882607.72326: _execute() done 30529 1726882607.72329: dumping result to json 30529 1726882607.72331: done dumping result, returning 30529 1726882607.72333: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-00000000073c] 30529 1726882607.72343: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073c 30529 1726882607.72426: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000073c 30529 1726882607.72430: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882607.72513: no more pending results, returning what we have 30529 1726882607.72521: results queue empty 30529 1726882607.72523: checking for any_errors_fatal 30529 1726882607.72549: done checking for any_errors_fatal 30529 1726882607.72550: checking for max_fail_percentage 30529 1726882607.72552: done checking for max_fail_percentage 30529 1726882607.72553: checking to see if all hosts have failed and the running result is not ok 30529 1726882607.72553: done checking to see if all hosts have failed 30529 1726882607.72554: getting the remaining hosts for this loop 30529 1726882607.72556: done getting the remaining hosts for this loop 30529 1726882607.72559: getting the next task for host managed_node1 30529 1726882607.72570: done getting next task for host managed_node1 30529 1726882607.72574: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30529 1726882607.72579: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882607.72645: getting variables 30529 1726882607.72647: in VariableManager get_vars() 30529 1726882607.72740: Calling all_inventory to load vars for managed_node1 30529 1726882607.72743: Calling groups_inventory to load vars for managed_node1 30529 1726882607.72745: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882607.72755: Calling all_plugins_play to load vars for managed_node1 30529 1726882607.72758: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882607.72760: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.73772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.74735: done with get_vars() 30529 1726882607.74751: done getting variables 30529 1726882607.74816: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:47 -0400 (0:00:00.091) 0:00:21.774 ****** 30529 1726882607.74841: entering _queue_task() for managed_node1/service 30529 1726882607.75163: worker is 1 (out of 1 available) 30529 1726882607.75176: exiting _queue_task() for managed_node1/service 30529 1726882607.75195: done queuing things up, now waiting for results queue to drain 30529 1726882607.75196: waiting for pending results... 
30529 1726882607.75400: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882607.75500: in run() - task 12673a56-9f93-b0f1-edc0-00000000073d 30529 1726882607.75511: variable 'ansible_search_path' from source: unknown 30529 1726882607.75515: variable 'ansible_search_path' from source: unknown 30529 1726882607.75545: calling self._execute() 30529 1726882607.75617: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.75621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.75676: variable 'omit' from source: magic vars 30529 1726882607.76028: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.76039: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.76154: variable 'network_provider' from source: set_fact 30529 1726882607.76158: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882607.76161: when evaluation is False, skipping this task 30529 1726882607.76173: _execute() done 30529 1726882607.76176: dumping result to json 30529 1726882607.76179: done dumping result, returning 30529 1726882607.76181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-00000000073d] 30529 1726882607.76184: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073d 30529 1726882607.76284: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000073d 30529 1726882607.76288: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882607.76360: no more pending results, returning what we have 30529 1726882607.76363: results queue empty 30529 1726882607.76364: checking for any_errors_fatal 30529 1726882607.76371: done checking for 
any_errors_fatal 30529 1726882607.76372: checking for max_fail_percentage 30529 1726882607.76373: done checking for max_fail_percentage 30529 1726882607.76375: checking to see if all hosts have failed and the running result is not ok 30529 1726882607.76376: done checking to see if all hosts have failed 30529 1726882607.76376: getting the remaining hosts for this loop 30529 1726882607.76378: done getting the remaining hosts for this loop 30529 1726882607.76382: getting the next task for host managed_node1 30529 1726882607.76392: done getting next task for host managed_node1 30529 1726882607.76397: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882607.76402: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882607.76421: getting variables 30529 1726882607.76422: in VariableManager get_vars() 30529 1726882607.76451: Calling all_inventory to load vars for managed_node1 30529 1726882607.76453: Calling groups_inventory to load vars for managed_node1 30529 1726882607.76455: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882607.76463: Calling all_plugins_play to load vars for managed_node1 30529 1726882607.76465: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882607.76468: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.77501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.78584: done with get_vars() 30529 1726882607.78606: done getting variables 30529 1726882607.78664: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:47 -0400 (0:00:00.038) 0:00:21.813 ****** 30529 1726882607.78699: entering _queue_task() for managed_node1/copy 30529 1726882607.78988: worker is 1 (out of 1 available) 30529 1726882607.79004: exiting _queue_task() for managed_node1/copy 30529 1726882607.79017: done queuing things up, now waiting for results queue to drain 30529 1726882607.79018: waiting for pending results... 
30529 1726882607.79202: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882607.79290: in run() - task 12673a56-9f93-b0f1-edc0-00000000073e 30529 1726882607.79302: variable 'ansible_search_path' from source: unknown 30529 1726882607.79306: variable 'ansible_search_path' from source: unknown 30529 1726882607.79334: calling self._execute() 30529 1726882607.79405: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.79409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.79418: variable 'omit' from source: magic vars 30529 1726882607.79769: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.79773: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.79854: variable 'network_provider' from source: set_fact 30529 1726882607.79858: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882607.79862: when evaluation is False, skipping this task 30529 1726882607.79864: _execute() done 30529 1726882607.79867: dumping result to json 30529 1726882607.79870: done dumping result, returning 30529 1726882607.79883: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-00000000073e] 30529 1726882607.79889: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073e 30529 1726882607.79970: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000073e 30529 1726882607.79973: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882607.80030: no more pending results, returning what we have 30529 1726882607.80034: results queue empty 30529 1726882607.80035: checking for 
any_errors_fatal 30529 1726882607.80042: done checking for any_errors_fatal 30529 1726882607.80042: checking for max_fail_percentage 30529 1726882607.80044: done checking for max_fail_percentage 30529 1726882607.80045: checking to see if all hosts have failed and the running result is not ok 30529 1726882607.80046: done checking to see if all hosts have failed 30529 1726882607.80046: getting the remaining hosts for this loop 30529 1726882607.80048: done getting the remaining hosts for this loop 30529 1726882607.80052: getting the next task for host managed_node1 30529 1726882607.80060: done getting next task for host managed_node1 30529 1726882607.80064: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882607.80069: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882607.80089: getting variables 30529 1726882607.80091: in VariableManager get_vars() 30529 1726882607.80126: Calling all_inventory to load vars for managed_node1 30529 1726882607.80128: Calling groups_inventory to load vars for managed_node1 30529 1726882607.80130: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882607.80141: Calling all_plugins_play to load vars for managed_node1 30529 1726882607.80144: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882607.80149: Calling groups_plugins_play to load vars for managed_node1 30529 1726882607.81185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882607.82589: done with get_vars() 30529 1726882607.82605: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:47 -0400 (0:00:00.039) 0:00:21.852 ****** 30529 1726882607.82679: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882607.82980: worker is 1 (out of 1 available) 30529 1726882607.83197: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882607.83208: done queuing things up, now waiting for results queue to drain 30529 1726882607.83209: waiting for pending results... 
30529 1726882607.83335: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882607.83461: in run() - task 12673a56-9f93-b0f1-edc0-00000000073f 30529 1726882607.83699: variable 'ansible_search_path' from source: unknown 30529 1726882607.83703: variable 'ansible_search_path' from source: unknown 30529 1726882607.83706: calling self._execute() 30529 1726882607.83744: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.83974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.83978: variable 'omit' from source: magic vars 30529 1726882607.84263: variable 'ansible_distribution_major_version' from source: facts 30529 1726882607.84281: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882607.84296: variable 'omit' from source: magic vars 30529 1726882607.84363: variable 'omit' from source: magic vars 30529 1726882607.84517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882607.85941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882607.85984: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882607.86018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882607.86044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882607.86065: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882607.86128: variable 'network_provider' from source: set_fact 30529 1726882607.86300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882607.86304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882607.86307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882607.86344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882607.86363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882607.86441: variable 'omit' from source: magic vars 30529 1726882607.86549: variable 'omit' from source: magic vars 30529 1726882607.86656: variable 'network_connections' from source: include params 30529 1726882607.86670: variable 'interface' from source: play vars 30529 1726882607.86736: variable 'interface' from source: play vars 30529 1726882607.86892: variable 'omit' from source: magic vars 30529 1726882607.86906: variable '__lsr_ansible_managed' from source: task vars 30529 1726882607.87098: variable '__lsr_ansible_managed' from source: task vars 30529 1726882607.87141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882607.87332: Loaded config def from plugin (lookup/template) 30529 1726882607.87341: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882607.87368: File lookup term: get_ansible_managed.j2 30529 1726882607.87374: variable 
'ansible_search_path' from source: unknown 30529 1726882607.87382: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882607.87404: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882607.87426: variable 'ansible_search_path' from source: unknown 30529 1726882607.92567: variable 'ansible_managed' from source: unknown 30529 1726882607.92645: variable 'omit' from source: magic vars 30529 1726882607.92664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882607.92685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882607.92703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882607.92716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882607.92724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882607.92744: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882607.92747: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.92750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.92818: Set connection var ansible_shell_executable to /bin/sh 30529 1726882607.92822: Set connection var ansible_pipelining to False 30529 1726882607.92824: Set connection var ansible_shell_type to sh 30529 1726882607.92832: Set connection var ansible_timeout to 10 30529 1726882607.92835: Set connection var ansible_connection to ssh 30529 1726882607.92839: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882607.92856: variable 'ansible_shell_executable' from source: unknown 30529 1726882607.92858: variable 'ansible_connection' from source: unknown 30529 1726882607.92861: variable 'ansible_module_compression' from source: unknown 30529 1726882607.92863: variable 'ansible_shell_type' from source: unknown 30529 1726882607.92865: variable 'ansible_shell_executable' from source: unknown 30529 1726882607.92869: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882607.92872: variable 'ansible_pipelining' from source: unknown 30529 1726882607.92874: variable 'ansible_timeout' from source: unknown 30529 1726882607.92876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882607.92966: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882607.92978: variable 'omit' from 
source: magic vars 30529 1726882607.92981: starting attempt loop 30529 1726882607.92983: running the handler 30529 1726882607.93000: _low_level_execute_command(): starting 30529 1726882607.93004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882607.93622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.93648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882607.93666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.93689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.93770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.95432: stdout chunk (state=3): >>>/root <<< 30529 1726882607.95550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.95560: stderr chunk (state=3): >>><<< 30529 1726882607.95563: stdout chunk (state=3): >>><<< 30529 1726882607.95578: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.95587: _low_level_execute_command(): starting 30529 1726882607.95597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078 `" && echo ansible-tmp-1726882607.9557827-31522-3938019223078="` echo /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078 `" ) && sleep 0' 30529 1726882607.96017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.96021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882607.96024: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.96026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.96077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.96129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.96184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882607.98035: stdout chunk (state=3): >>>ansible-tmp-1726882607.9557827-31522-3938019223078=/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078 <<< 30529 1726882607.98189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882607.98192: stdout chunk (state=3): >>><<< 30529 1726882607.98196: stderr chunk (state=3): >>><<< 30529 1726882607.98401: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882607.9557827-31522-3938019223078=/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882607.98406: variable 'ansible_module_compression' from source: unknown 30529 1726882607.98408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882607.98411: variable 'ansible_facts' from source: unknown 30529 1726882607.98509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py 30529 1726882607.98635: Sending initial data 30529 1726882607.98736: Sent initial data (166 bytes) 30529 1726882607.99302: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882607.99318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882607.99331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882607.99349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882607.99392: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882607.99472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882607.99510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882607.99546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882607.99590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.01115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882608.01159: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882608.01192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj8befyd5 /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py <<< 30529 1726882608.01224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py" <<< 30529 1726882608.01274: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj8befyd5" to remote "/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py" <<< 30529 1726882608.02506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882608.02510: stderr chunk (state=3): >>><<< 30529 1726882608.02512: stdout chunk (state=3): >>><<< 30529 1726882608.02514: done transferring module to remote 30529 1726882608.02528: _low_level_execute_command(): starting 30529 1726882608.02537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/ /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py && sleep 0' 30529 1726882608.03202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882608.03218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882608.03277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.03346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882608.03388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882608.03423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882608.03468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.05272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882608.05275: stdout chunk (state=3): >>><<< 30529 1726882608.05277: stderr chunk (state=3): >>><<< 30529 1726882608.05299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882608.05308: _low_level_execute_command(): starting 30529 1726882608.05414: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/AnsiballZ_network_connections.py && sleep 0' 30529 1726882608.06221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.06255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882608.06281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 30529 1726882608.06344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.31661: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882608.33382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882608.33385: stdout chunk (state=3): >>><<< 30529 1726882608.33388: stderr chunk (state=3): >>><<< 30529 1726882608.33411: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882608.33500: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882608.33503: _low_level_execute_command(): starting 30529 1726882608.33505: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882607.9557827-31522-3938019223078/ > /dev/null 2>&1 && sleep 0' 30529 1726882608.34063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882608.34078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882608.34101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882608.34119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 
1726882608.34136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882608.34148: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882608.34161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.34178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882608.34208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.34268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882608.34285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882608.34314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882608.34383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.36501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882608.36559: stdout chunk (state=3): >>><<< 30529 1726882608.36563: stderr chunk (state=3): >>><<< 30529 1726882608.36591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882608.36601: handler run complete 30529 1726882608.36640: attempt loop complete, returning result 30529 1726882608.36647: _execute() done 30529 1726882608.36652: dumping result to json 30529 1726882608.36661: done dumping result, returning 30529 1726882608.36673: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-00000000073f] 30529 1726882608.36705: sending task result for task 12673a56-9f93-b0f1-edc0-00000000073f changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140 30529 1726882608.37003: no more pending results, returning what we have 30529 
1726882608.37006: results queue empty 30529 1726882608.37007: checking for any_errors_fatal 30529 1726882608.37011: done checking for any_errors_fatal 30529 1726882608.37012: checking for max_fail_percentage 30529 1726882608.37014: done checking for max_fail_percentage 30529 1726882608.37014: checking to see if all hosts have failed and the running result is not ok 30529 1726882608.37015: done checking to see if all hosts have failed 30529 1726882608.37016: getting the remaining hosts for this loop 30529 1726882608.37018: done getting the remaining hosts for this loop 30529 1726882608.37021: getting the next task for host managed_node1 30529 1726882608.37029: done getting next task for host managed_node1 30529 1726882608.37032: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882608.37036: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882608.37047: getting variables 30529 1726882608.37048: in VariableManager get_vars() 30529 1726882608.37082: Calling all_inventory to load vars for managed_node1 30529 1726882608.37084: Calling groups_inventory to load vars for managed_node1 30529 1726882608.37086: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882608.37325: Calling all_plugins_play to load vars for managed_node1 30529 1726882608.37329: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882608.37333: Calling groups_plugins_play to load vars for managed_node1 30529 1726882608.38101: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000073f 30529 1726882608.38105: WORKER PROCESS EXITING 30529 1726882608.39652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882608.41797: done with get_vars() 30529 1726882608.41817: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:48 -0400 (0:00:00.592) 0:00:22.445 ****** 30529 1726882608.41901: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882608.42649: worker is 1 (out of 1 available) 30529 1726882608.42663: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882608.42677: done queuing things up, now waiting for results queue to drain 30529 1726882608.42679: waiting for pending results... 
30529 1726882608.43186: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882608.43750: in run() - task 12673a56-9f93-b0f1-edc0-000000000740 30529 1726882608.43775: variable 'ansible_search_path' from source: unknown 30529 1726882608.44101: variable 'ansible_search_path' from source: unknown 30529 1726882608.44105: calling self._execute() 30529 1726882608.44141: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.44152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.44167: variable 'omit' from source: magic vars 30529 1726882608.44879: variable 'ansible_distribution_major_version' from source: facts 30529 1726882608.45007: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882608.45217: variable 'network_state' from source: role '' defaults 30529 1726882608.45235: Evaluated conditional (network_state != {}): False 30529 1726882608.45248: when evaluation is False, skipping this task 30529 1726882608.45257: _execute() done 30529 1726882608.45264: dumping result to json 30529 1726882608.45273: done dumping result, returning 30529 1726882608.45290: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000000740] 30529 1726882608.45307: sending task result for task 12673a56-9f93-b0f1-edc0-000000000740 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882608.45508: no more pending results, returning what we have 30529 1726882608.45513: results queue empty 30529 1726882608.45514: checking for any_errors_fatal 30529 1726882608.45526: done checking for any_errors_fatal 30529 1726882608.45527: checking for max_fail_percentage 30529 1726882608.45529: done checking for max_fail_percentage 30529 1726882608.45530: 
checking to see if all hosts have failed and the running result is not ok 30529 1726882608.45531: done checking to see if all hosts have failed 30529 1726882608.45532: getting the remaining hosts for this loop 30529 1726882608.45533: done getting the remaining hosts for this loop 30529 1726882608.45537: getting the next task for host managed_node1 30529 1726882608.45545: done getting next task for host managed_node1 30529 1726882608.45549: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882608.45555: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882608.45579: getting variables 30529 1726882608.45581: in VariableManager get_vars() 30529 1726882608.45622: Calling all_inventory to load vars for managed_node1 30529 1726882608.45625: Calling groups_inventory to load vars for managed_node1 30529 1726882608.45627: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882608.45640: Calling all_plugins_play to load vars for managed_node1 30529 1726882608.45644: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882608.45647: Calling groups_plugins_play to load vars for managed_node1 30529 1726882608.46217: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000740 30529 1726882608.46222: WORKER PROCESS EXITING 30529 1726882608.47255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882608.50537: done with get_vars() 30529 1726882608.50565: done getting variables 30529 1726882608.50807: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:48 -0400 (0:00:00.089) 0:00:22.534 ****** 30529 1726882608.50846: entering _queue_task() for managed_node1/debug 30529 1726882608.51233: worker is 1 (out of 1 available) 30529 1726882608.51244: exiting _queue_task() for managed_node1/debug 30529 1726882608.51256: done queuing things up, now waiting for results queue to drain 30529 1726882608.51257: waiting for pending results... 
30529 1726882608.51711: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882608.51717: in run() - task 12673a56-9f93-b0f1-edc0-000000000741 30529 1726882608.51721: variable 'ansible_search_path' from source: unknown 30529 1726882608.51725: variable 'ansible_search_path' from source: unknown 30529 1726882608.51728: calling self._execute() 30529 1726882608.51781: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.51797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.51811: variable 'omit' from source: magic vars 30529 1726882608.52180: variable 'ansible_distribution_major_version' from source: facts 30529 1726882608.52201: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882608.52211: variable 'omit' from source: magic vars 30529 1726882608.52283: variable 'omit' from source: magic vars 30529 1726882608.52325: variable 'omit' from source: magic vars 30529 1726882608.52368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882608.52422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882608.52446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882608.52468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.52486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.52599: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882608.52604: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.52606: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882608.52647: Set connection var ansible_shell_executable to /bin/sh 30529 1726882608.52657: Set connection var ansible_pipelining to False 30529 1726882608.52664: Set connection var ansible_shell_type to sh 30529 1726882608.52677: Set connection var ansible_timeout to 10 30529 1726882608.52683: Set connection var ansible_connection to ssh 30529 1726882608.52700: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882608.52798: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.52801: variable 'ansible_connection' from source: unknown 30529 1726882608.52804: variable 'ansible_module_compression' from source: unknown 30529 1726882608.52806: variable 'ansible_shell_type' from source: unknown 30529 1726882608.52808: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.52810: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.52812: variable 'ansible_pipelining' from source: unknown 30529 1726882608.52815: variable 'ansible_timeout' from source: unknown 30529 1726882608.52817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.52917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882608.52936: variable 'omit' from source: magic vars 30529 1726882608.52946: starting attempt loop 30529 1726882608.52952: running the handler 30529 1726882608.53079: variable '__network_connections_result' from source: set_fact 30529 1726882608.53135: handler run complete 30529 1726882608.53163: attempt loop complete, returning result 30529 1726882608.53171: _execute() done 30529 1726882608.53177: dumping result to json 30529 1726882608.53264: 
done dumping result, returning 30529 1726882608.53268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000741] 30529 1726882608.53270: sending task result for task 12673a56-9f93-b0f1-edc0-000000000741 30529 1726882608.53337: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000741 30529 1726882608.53340: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140" ] } 30529 1726882608.53435: no more pending results, returning what we have 30529 1726882608.53438: results queue empty 30529 1726882608.53439: checking for any_errors_fatal 30529 1726882608.53446: done checking for any_errors_fatal 30529 1726882608.53447: checking for max_fail_percentage 30529 1726882608.53448: done checking for max_fail_percentage 30529 1726882608.53449: checking to see if all hosts have failed and the running result is not ok 30529 1726882608.53450: done checking to see if all hosts have failed 30529 1726882608.53451: getting the remaining hosts for this loop 30529 1726882608.53453: done getting the remaining hosts for this loop 30529 1726882608.53457: getting the next task for host managed_node1 30529 1726882608.53466: done getting next task for host managed_node1 30529 1726882608.53470: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882608.53476: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882608.53486: getting variables 30529 1726882608.53491: in VariableManager get_vars() 30529 1726882608.53526: Calling all_inventory to load vars for managed_node1 30529 1726882608.53529: Calling groups_inventory to load vars for managed_node1 30529 1726882608.53532: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882608.53543: Calling all_plugins_play to load vars for managed_node1 30529 1726882608.53546: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882608.53549: Calling groups_plugins_play to load vars for managed_node1 30529 1726882608.55934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882608.59248: done with get_vars() 30529 1726882608.59275: done getting variables 30529 1726882608.59342: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:48 -0400 (0:00:00.085) 0:00:22.619 ****** 30529 1726882608.59384: entering _queue_task() for managed_node1/debug 30529 1726882608.60133: worker is 1 (out of 1 available) 30529 1726882608.60146: exiting _queue_task() for managed_node1/debug 30529 1726882608.60157: done queuing things up, now waiting for results queue to drain 30529 1726882608.60158: waiting for pending results... 30529 1726882608.60815: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882608.61139: in run() - task 12673a56-9f93-b0f1-edc0-000000000742 30529 1726882608.61191: variable 'ansible_search_path' from source: unknown 30529 1726882608.61224: variable 'ansible_search_path' from source: unknown 30529 1726882608.61407: calling self._execute() 30529 1726882608.61801: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.61805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.61808: variable 'omit' from source: magic vars 30529 1726882608.62416: variable 'ansible_distribution_major_version' from source: facts 30529 1726882608.62706: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882608.62717: variable 'omit' from source: magic vars 30529 1726882608.62782: variable 'omit' from source: magic vars 30529 1726882608.63036: variable 'omit' from source: magic vars 30529 1726882608.63398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882608.63402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882608.63405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882608.63407: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.63409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.63411: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882608.63414: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.63417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.63670: Set connection var ansible_shell_executable to /bin/sh 30529 1726882608.63998: Set connection var ansible_pipelining to False 30529 1726882608.64002: Set connection var ansible_shell_type to sh 30529 1726882608.64004: Set connection var ansible_timeout to 10 30529 1726882608.64006: Set connection var ansible_connection to ssh 30529 1726882608.64009: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882608.64011: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.64013: variable 'ansible_connection' from source: unknown 30529 1726882608.64017: variable 'ansible_module_compression' from source: unknown 30529 1726882608.64019: variable 'ansible_shell_type' from source: unknown 30529 1726882608.64021: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.64023: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.64025: variable 'ansible_pipelining' from source: unknown 30529 1726882608.64027: variable 'ansible_timeout' from source: unknown 30529 1726882608.64031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.64166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882608.64199: variable 'omit' from source: magic vars 30529 1726882608.64498: starting attempt loop 30529 1726882608.64502: running the handler 30529 1726882608.64505: variable '__network_connections_result' from source: set_fact 30529 1726882608.64545: variable '__network_connections_result' from source: set_fact 30529 1726882608.64678: handler run complete 30529 1726882608.64925: attempt loop complete, returning result 30529 1726882608.65098: _execute() done 30529 1726882608.65102: dumping result to json 30529 1726882608.65104: done dumping result, returning 30529 1726882608.65107: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000742] 30529 1726882608.65109: sending task result for task 12673a56-9f93-b0f1-edc0-000000000742 30529 1726882608.65183: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000742 30529 1726882608.65187: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140" ] } } 30529 1726882608.65278: no more pending results, returning what we have 30529 
1726882608.65282: results queue empty 30529 1726882608.65284: checking for any_errors_fatal 30529 1726882608.65295: done checking for any_errors_fatal 30529 1726882608.65296: checking for max_fail_percentage 30529 1726882608.65298: done checking for max_fail_percentage 30529 1726882608.65299: checking to see if all hosts have failed and the running result is not ok 30529 1726882608.65300: done checking to see if all hosts have failed 30529 1726882608.65301: getting the remaining hosts for this loop 30529 1726882608.65302: done getting the remaining hosts for this loop 30529 1726882608.65306: getting the next task for host managed_node1 30529 1726882608.65315: done getting next task for host managed_node1 30529 1726882608.65318: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882608.65324: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882608.65335: getting variables 30529 1726882608.65343: in VariableManager get_vars() 30529 1726882608.65378: Calling all_inventory to load vars for managed_node1 30529 1726882608.65381: Calling groups_inventory to load vars for managed_node1 30529 1726882608.65384: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882608.65701: Calling all_plugins_play to load vars for managed_node1 30529 1726882608.65705: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882608.65708: Calling groups_plugins_play to load vars for managed_node1 30529 1726882608.67557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882608.69185: done with get_vars() 30529 1726882608.69515: done getting variables 30529 1726882608.69573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:48 -0400 (0:00:00.102) 0:00:22.722 ****** 30529 1726882608.69611: entering _queue_task() for managed_node1/debug 30529 1726882608.70344: worker is 1 (out of 1 available) 30529 1726882608.70359: exiting _queue_task() for managed_node1/debug 30529 1726882608.70371: done queuing things up, now waiting for results queue to drain 30529 1726882608.70372: waiting for pending results... 
30529 1726882608.70570: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882608.70940: in run() - task 12673a56-9f93-b0f1-edc0-000000000743 30529 1726882608.70960: variable 'ansible_search_path' from source: unknown 30529 1726882608.70968: variable 'ansible_search_path' from source: unknown 30529 1726882608.71012: calling self._execute() 30529 1726882608.71287: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.71498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.71501: variable 'omit' from source: magic vars 30529 1726882608.71863: variable 'ansible_distribution_major_version' from source: facts 30529 1726882608.72113: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882608.72229: variable 'network_state' from source: role '' defaults 30529 1726882608.72499: Evaluated conditional (network_state != {}): False 30529 1726882608.72502: when evaluation is False, skipping this task 30529 1726882608.72504: _execute() done 30529 1726882608.72507: dumping result to json 30529 1726882608.72509: done dumping result, returning 30529 1726882608.72512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000000743] 30529 1726882608.72514: sending task result for task 12673a56-9f93-b0f1-edc0-000000000743 30529 1726882608.72586: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000743 30529 1726882608.72590: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882608.72635: no more pending results, returning what we have 30529 1726882608.72639: results queue empty 30529 1726882608.72640: checking for any_errors_fatal 30529 1726882608.72649: done checking for any_errors_fatal 30529 1726882608.72650: checking for 
max_fail_percentage 30529 1726882608.72652: done checking for max_fail_percentage 30529 1726882608.72653: checking to see if all hosts have failed and the running result is not ok 30529 1726882608.72654: done checking to see if all hosts have failed 30529 1726882608.72655: getting the remaining hosts for this loop 30529 1726882608.72656: done getting the remaining hosts for this loop 30529 1726882608.72660: getting the next task for host managed_node1 30529 1726882608.72668: done getting next task for host managed_node1 30529 1726882608.72672: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882608.72678: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882608.72699: getting variables 30529 1726882608.72701: in VariableManager get_vars() 30529 1726882608.72735: Calling all_inventory to load vars for managed_node1 30529 1726882608.72737: Calling groups_inventory to load vars for managed_node1 30529 1726882608.72739: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882608.72748: Calling all_plugins_play to load vars for managed_node1 30529 1726882608.72750: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882608.72753: Calling groups_plugins_play to load vars for managed_node1 30529 1726882608.84846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882608.87895: done with get_vars() 30529 1726882608.87931: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:48 -0400 (0:00:00.184) 0:00:22.906 ****** 30529 1726882608.88021: entering _queue_task() for managed_node1/ping 30529 1726882608.88882: worker is 1 (out of 1 available) 30529 1726882608.88896: exiting _queue_task() for managed_node1/ping 30529 1726882608.88908: done queuing things up, now waiting for results queue to drain 30529 1726882608.88909: waiting for pending results... 
30529 1726882608.89155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882608.89322: in run() - task 12673a56-9f93-b0f1-edc0-000000000744 30529 1726882608.89344: variable 'ansible_search_path' from source: unknown 30529 1726882608.89354: variable 'ansible_search_path' from source: unknown 30529 1726882608.89397: calling self._execute() 30529 1726882608.89498: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.89513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.89532: variable 'omit' from source: magic vars 30529 1726882608.89958: variable 'ansible_distribution_major_version' from source: facts 30529 1726882608.89962: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882608.89965: variable 'omit' from source: magic vars 30529 1726882608.90008: variable 'omit' from source: magic vars 30529 1726882608.90045: variable 'omit' from source: magic vars 30529 1726882608.90096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882608.90134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882608.90155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882608.90181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.90286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882608.90291: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882608.90295: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.90297: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882608.90347: Set connection var ansible_shell_executable to /bin/sh 30529 1726882608.90357: Set connection var ansible_pipelining to False 30529 1726882608.90364: Set connection var ansible_shell_type to sh 30529 1726882608.90377: Set connection var ansible_timeout to 10 30529 1726882608.90383: Set connection var ansible_connection to ssh 30529 1726882608.90501: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882608.90504: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.90506: variable 'ansible_connection' from source: unknown 30529 1726882608.90509: variable 'ansible_module_compression' from source: unknown 30529 1726882608.90511: variable 'ansible_shell_type' from source: unknown 30529 1726882608.90513: variable 'ansible_shell_executable' from source: unknown 30529 1726882608.90515: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882608.90517: variable 'ansible_pipelining' from source: unknown 30529 1726882608.90518: variable 'ansible_timeout' from source: unknown 30529 1726882608.90520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882608.90658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882608.90672: variable 'omit' from source: magic vars 30529 1726882608.90682: starting attempt loop 30529 1726882608.90688: running the handler 30529 1726882608.90708: _low_level_execute_command(): starting 30529 1726882608.90742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882608.91444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882608.91492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882608.91514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882608.91608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882608.91627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882608.91969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.93597: stdout chunk (state=3): >>>/root <<< 30529 1726882608.93733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882608.93737: stderr chunk (state=3): >>><<< 30529 1726882608.93743: stdout chunk (state=3): >>><<< 30529 1726882608.93771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882608.93785: _low_level_execute_command(): starting 30529 1726882608.93803: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231 `" && echo ansible-tmp-1726882608.9377058-31567-89914770509231="` echo /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231 `" ) && sleep 0' 30529 1726882608.94930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882608.95067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882608.95084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882608.95117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882608.95120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882608.95123: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882608.95134: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.95224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882608.95227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882608.95229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882608.95232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882608.95234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882608.95237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882608.95238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882608.95241: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882608.95243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882608.95624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882608.95627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882608.97326: stdout chunk (state=3): >>>ansible-tmp-1726882608.9377058-31567-89914770509231=/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231 <<< 30529 1726882608.97454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882608.97458: stdout chunk (state=3): >>><<< 30529 1726882608.97465: stderr chunk (state=3): >>><<< 30529 1726882608.97717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882608.9377058-31567-89914770509231=/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882608.97767: variable 'ansible_module_compression' from source: unknown 30529 1726882608.97992: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882608.97998: variable 'ansible_facts' from source: unknown 30529 1726882608.98065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py 30529 1726882608.98798: Sending initial data 30529 1726882608.98803: Sent initial data (152 bytes) 30529 1726882608.99610: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882608.99618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882608.99710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882608.99908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.00017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.00089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.01614: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 
1726882609.01646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882609.01707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvayhp34r /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py <<< 30529 1726882609.01710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py" <<< 30529 1726882609.01801: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvayhp34r" to remote "/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py" <<< 30529 1726882609.03099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.03102: stdout chunk (state=3): >>><<< 30529 1726882609.03105: stderr chunk (state=3): >>><<< 30529 1726882609.03107: done transferring module to remote 30529 1726882609.03109: _low_level_execute_command(): starting 30529 1726882609.03111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/ /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py && sleep 0' 30529 1726882609.04421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.04430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882609.04440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882609.04453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882609.04464: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882609.04471: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882609.04479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.04497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882609.04671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.04698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.04701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.04759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.06863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.06867: stdout chunk (state=3): >>><<< 30529 1726882609.06869: stderr chunk (state=3): >>><<< 30529 1726882609.06872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882609.06875: _low_level_execute_command(): starting 30529 1726882609.06877: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/AnsiballZ_ping.py && sleep 0' 30529 1726882609.08010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.08022: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.08040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.08054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.08125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.22744: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882609.23950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.23968: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 30529 1726882609.24034: stderr chunk (state=3): >>><<< 30529 1726882609.24049: stdout chunk (state=3): >>><<< 30529 1726882609.24079: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882609.24117: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882609.24141: _low_level_execute_command(): starting 30529 1726882609.24156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882608.9377058-31567-89914770509231/ > /dev/null 2>&1 && sleep 0' 30529 1726882609.24882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.24926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.24955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.24970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.25045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.26892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.26909: stdout chunk (state=3): >>><<< 30529 1726882609.26922: stderr chunk (state=3): >>><<< 30529 1726882609.26944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30529 1726882609.27099: handler run complete 30529 1726882609.27102: attempt loop complete, returning result 30529 1726882609.27105: _execute() done 30529 1726882609.27108: dumping result to json 30529 1726882609.27110: done dumping result, returning 30529 1726882609.27112: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000000744] 30529 1726882609.27114: sending task result for task 12673a56-9f93-b0f1-edc0-000000000744 30529 1726882609.27180: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000744 30529 1726882609.27183: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882609.27251: no more pending results, returning what we have 30529 1726882609.27255: results queue empty 30529 1726882609.27256: checking for any_errors_fatal 30529 1726882609.27267: done checking for any_errors_fatal 30529 1726882609.27267: checking for max_fail_percentage 30529 1726882609.27269: done checking for max_fail_percentage 30529 1726882609.27270: checking to see if all hosts have failed and the running result is not ok 30529 1726882609.27271: done checking to see if all hosts have failed 30529 1726882609.27272: getting the remaining hosts for this loop 30529 1726882609.27274: done getting the remaining hosts for this loop 30529 1726882609.27278: getting the next task for host managed_node1 30529 1726882609.27290: done getting next task for host managed_node1 30529 1726882609.27312: ^ task is: TASK: meta (role_complete) 30529 1726882609.27319: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882609.27331: getting variables 30529 1726882609.27333: in VariableManager get_vars() 30529 1726882609.27375: Calling all_inventory to load vars for managed_node1 30529 1726882609.27377: Calling groups_inventory to load vars for managed_node1 30529 1726882609.27380: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.27391: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.27600: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.27605: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.29101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.30617: done with get_vars() 30529 1726882609.30639: done getting variables 30529 1726882609.30717: done queuing things up, now waiting for results queue to drain 30529 1726882609.30719: results queue empty 30529 1726882609.30720: checking for any_errors_fatal 30529 1726882609.30722: done checking for any_errors_fatal 30529 1726882609.30723: checking for max_fail_percentage 30529 1726882609.30724: done checking for max_fail_percentage 30529 1726882609.30724: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882609.30725: done checking to see if all hosts have failed 30529 1726882609.30726: getting the remaining hosts for this loop 30529 1726882609.30727: done getting the remaining hosts for this loop 30529 1726882609.30729: getting the next task for host managed_node1 30529 1726882609.30733: done getting next task for host managed_node1 30529 1726882609.30735: ^ task is: TASK: Show result 30529 1726882609.30737: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882609.30739: getting variables 30529 1726882609.30740: in VariableManager get_vars() 30529 1726882609.30748: Calling all_inventory to load vars for managed_node1 30529 1726882609.30750: Calling groups_inventory to load vars for managed_node1 30529 1726882609.30752: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.30756: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.30758: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.30761: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.31943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.33484: done with get_vars() 30529 1726882609.33509: done getting variables 30529 1726882609.33548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 21:36:49 -0400 (0:00:00.455) 0:00:23.361 ****** 30529 1726882609.33580: entering _queue_task() for managed_node1/debug 30529 1726882609.34036: worker is 1 (out of 1 available) 30529 1726882609.34050: exiting _queue_task() for managed_node1/debug 30529 1726882609.34063: done queuing things up, now waiting for results queue to drain 30529 1726882609.34064: waiting for pending results... 
30529 1726882609.34465: running TaskExecutor() for managed_node1/TASK: Show result 30529 1726882609.34539: in run() - task 12673a56-9f93-b0f1-edc0-0000000006b2 30529 1726882609.34668: variable 'ansible_search_path' from source: unknown 30529 1726882609.34672: variable 'ansible_search_path' from source: unknown 30529 1726882609.34676: calling self._execute() 30529 1726882609.34730: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.34741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.34756: variable 'omit' from source: magic vars 30529 1726882609.35163: variable 'ansible_distribution_major_version' from source: facts 30529 1726882609.35182: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882609.35197: variable 'omit' from source: magic vars 30529 1726882609.35253: variable 'omit' from source: magic vars 30529 1726882609.35296: variable 'omit' from source: magic vars 30529 1726882609.35347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882609.35382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882609.35424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882609.35427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882609.35441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882609.35470: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882609.35498: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.35500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.35581: Set 
connection var ansible_shell_executable to /bin/sh 30529 1726882609.35591: Set connection var ansible_pipelining to False 30529 1726882609.35602: Set connection var ansible_shell_type to sh 30529 1726882609.35641: Set connection var ansible_timeout to 10 30529 1726882609.35645: Set connection var ansible_connection to ssh 30529 1726882609.35647: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882609.35660: variable 'ansible_shell_executable' from source: unknown 30529 1726882609.35667: variable 'ansible_connection' from source: unknown 30529 1726882609.35673: variable 'ansible_module_compression' from source: unknown 30529 1726882609.35697: variable 'ansible_shell_type' from source: unknown 30529 1726882609.35699: variable 'ansible_shell_executable' from source: unknown 30529 1726882609.35701: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.35704: variable 'ansible_pipelining' from source: unknown 30529 1726882609.35706: variable 'ansible_timeout' from source: unknown 30529 1726882609.35708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.35840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882609.35898: variable 'omit' from source: magic vars 30529 1726882609.35901: starting attempt loop 30529 1726882609.35904: running the handler 30529 1726882609.35926: variable '__network_connections_result' from source: set_fact 30529 1726882609.36018: variable '__network_connections_result' from source: set_fact 30529 1726882609.36145: handler run complete 30529 1726882609.36175: attempt loop complete, returning result 30529 1726882609.36295: _execute() done 30529 1726882609.36299: dumping result to json 30529 
1726882609.36301: done dumping result, returning 30529 1726882609.36304: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-0000000006b2] 30529 1726882609.36306: sending task result for task 12673a56-9f93-b0f1-edc0-0000000006b2 30529 1726882609.36381: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000006b2 30529 1726882609.36384: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, a53fa9d7-87cd-4e9e-bb37-caaa5cc02140" ] } } 30529 1726882609.36459: no more pending results, returning what we have 30529 1726882609.36462: results queue empty 30529 1726882609.36464: checking for any_errors_fatal 30529 1726882609.36465: done checking for any_errors_fatal 30529 1726882609.36466: checking for max_fail_percentage 30529 1726882609.36469: done checking for max_fail_percentage 30529 1726882609.36470: checking to see if all hosts have failed and the running result is not ok 30529 1726882609.36471: done checking to see if all hosts have failed 30529 1726882609.36471: getting the remaining hosts for this loop 30529 1726882609.36473: done getting the remaining hosts for this loop 30529 1726882609.36477: getting the next task for host managed_node1 30529 1726882609.36487: done getting next task for host managed_node1 30529 1726882609.36490: ^ task is: TASK: Asserts 30529 1726882609.36495: ^ 
state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882609.36501: getting variables 30529 1726882609.36503: in VariableManager get_vars() 30529 1726882609.36535: Calling all_inventory to load vars for managed_node1 30529 1726882609.36537: Calling groups_inventory to load vars for managed_node1 30529 1726882609.36541: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.36552: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.36556: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.36559: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.38069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.39718: done with get_vars() 30529 1726882609.39739: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:36:49 -0400 (0:00:00.062) 0:00:23.424 ****** 30529 1726882609.39842: entering _queue_task() for managed_node1/include_tasks 30529 1726882609.40175: worker is 1 (out of 1 available) 30529 1726882609.40189: exiting _queue_task() for managed_node1/include_tasks 30529 1726882609.40204: done queuing things up, now waiting for results queue to drain 30529 1726882609.40206: waiting 
for pending results... 30529 1726882609.40611: running TaskExecutor() for managed_node1/TASK: Asserts 30529 1726882609.40616: in run() - task 12673a56-9f93-b0f1-edc0-0000000005b9 30529 1726882609.40619: variable 'ansible_search_path' from source: unknown 30529 1726882609.40621: variable 'ansible_search_path' from source: unknown 30529 1726882609.40648: variable 'lsr_assert' from source: include params 30529 1726882609.40926: variable 'lsr_assert' from source: include params 30529 1726882609.40933: variable 'omit' from source: magic vars 30529 1726882609.41059: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.41073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.41086: variable 'omit' from source: magic vars 30529 1726882609.41324: variable 'ansible_distribution_major_version' from source: facts 30529 1726882609.41338: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882609.41348: variable 'item' from source: unknown 30529 1726882609.41418: variable 'item' from source: unknown 30529 1726882609.41451: variable 'item' from source: unknown 30529 1726882609.41518: variable 'item' from source: unknown 30529 1726882609.41777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.41781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.41783: variable 'omit' from source: magic vars 30529 1726882609.41931: variable 'ansible_distribution_major_version' from source: facts 30529 1726882609.41942: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882609.41950: variable 'item' from source: unknown 30529 1726882609.42100: variable 'item' from source: unknown 30529 1726882609.42103: variable 'item' from source: unknown 30529 1726882609.42114: variable 'item' from source: unknown 30529 1726882609.42200: dumping result to json 30529 1726882609.42216: done dumping 
result, returning 30529 1726882609.42227: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-0000000005b9] 30529 1726882609.42318: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b9 30529 1726882609.42359: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005b9 30529 1726882609.42362: WORKER PROCESS EXITING 30529 1726882609.42447: no more pending results, returning what we have 30529 1726882609.42452: in VariableManager get_vars() 30529 1726882609.42487: Calling all_inventory to load vars for managed_node1 30529 1726882609.42489: Calling groups_inventory to load vars for managed_node1 30529 1726882609.42492: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.42509: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.42513: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.42516: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.43937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.45421: done with get_vars() 30529 1726882609.45441: variable 'ansible_search_path' from source: unknown 30529 1726882609.45442: variable 'ansible_search_path' from source: unknown 30529 1726882609.45484: variable 'ansible_search_path' from source: unknown 30529 1726882609.45486: variable 'ansible_search_path' from source: unknown 30529 1726882609.45518: we have included files to process 30529 1726882609.45519: generating all_blocks data 30529 1726882609.45522: done generating all_blocks data 30529 1726882609.45528: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882609.45529: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 
1726882609.45531: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882609.45644: in VariableManager get_vars() 30529 1726882609.45664: done with get_vars() 30529 1726882609.45776: done processing included file 30529 1726882609.45778: iterating over new_blocks loaded from include file 30529 1726882609.45779: in VariableManager get_vars() 30529 1726882609.45794: done with get_vars() 30529 1726882609.45796: filtering new block on tags 30529 1726882609.45835: done filtering new block on tags 30529 1726882609.45838: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 => (item=tasks/assert_device_absent.yml) 30529 1726882609.45843: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882609.45844: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882609.45847: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882609.45947: in VariableManager get_vars() 30529 1726882609.45966: done with get_vars() 30529 1726882609.46210: done processing included file 30529 1726882609.46212: iterating over new_blocks loaded from include file 30529 1726882609.46214: in VariableManager get_vars() 30529 1726882609.46228: done with get_vars() 30529 1726882609.46229: filtering new block on tags 30529 1726882609.46276: done filtering new block on tags 30529 1726882609.46279: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=tasks/assert_profile_present.yml) 30529 1726882609.46283: extending task lists for all hosts with included blocks 30529 1726882609.47382: done extending task lists 30529 1726882609.47384: done processing included files 30529 1726882609.47384: results queue empty 30529 1726882609.47385: checking for any_errors_fatal 30529 1726882609.47390: done checking for any_errors_fatal 30529 1726882609.47390: checking for max_fail_percentage 30529 1726882609.47391: done checking for max_fail_percentage 30529 1726882609.47392: checking to see if all hosts have failed and the running result is not ok 30529 1726882609.47506: done checking to see if all hosts have failed 30529 1726882609.47508: getting the remaining hosts for this loop 30529 1726882609.47509: done getting the remaining hosts for this loop 30529 1726882609.47512: getting the next task for host managed_node1 30529 1726882609.47516: done getting next task for host managed_node1 30529 1726882609.47518: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882609.47521: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882609.47529: getting variables 30529 1726882609.47530: in VariableManager get_vars() 30529 1726882609.47538: Calling all_inventory to load vars for managed_node1 30529 1726882609.47540: Calling groups_inventory to load vars for managed_node1 30529 1726882609.47542: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.47547: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.47549: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.47551: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.48663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.50121: done with get_vars() 30529 1726882609.50140: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:49 -0400 (0:00:00.103) 0:00:23.528 ****** 30529 1726882609.50212: entering _queue_task() for managed_node1/include_tasks 30529 1726882609.50564: worker is 1 (out of 1 available) 30529 1726882609.50577: exiting _queue_task() for managed_node1/include_tasks 30529 1726882609.50589: done queuing things up, now waiting for results queue to drain 30529 1726882609.50590: waiting for pending results... 
30529 1726882609.50879: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882609.50990: in run() - task 12673a56-9f93-b0f1-edc0-0000000008a8 30529 1726882609.51020: variable 'ansible_search_path' from source: unknown 30529 1726882609.51023: variable 'ansible_search_path' from source: unknown 30529 1726882609.51130: calling self._execute() 30529 1726882609.51157: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.51169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.51185: variable 'omit' from source: magic vars 30529 1726882609.51539: variable 'ansible_distribution_major_version' from source: facts 30529 1726882609.51554: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882609.51568: _execute() done 30529 1726882609.51574: dumping result to json 30529 1726882609.51579: done dumping result, returning 30529 1726882609.51586: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-0000000008a8] 30529 1726882609.51597: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008a8 30529 1726882609.51829: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008a8 30529 1726882609.51832: WORKER PROCESS EXITING 30529 1726882609.51858: no more pending results, returning what we have 30529 1726882609.51863: in VariableManager get_vars() 30529 1726882609.51900: Calling all_inventory to load vars for managed_node1 30529 1726882609.51903: Calling groups_inventory to load vars for managed_node1 30529 1726882609.51906: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.51919: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.51922: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.51924: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882609.53305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.54854: done with get_vars() 30529 1726882609.54871: variable 'ansible_search_path' from source: unknown 30529 1726882609.54872: variable 'ansible_search_path' from source: unknown 30529 1726882609.54879: variable 'item' from source: include params 30529 1726882609.54979: variable 'item' from source: include params 30529 1726882609.55012: we have included files to process 30529 1726882609.55014: generating all_blocks data 30529 1726882609.55015: done generating all_blocks data 30529 1726882609.55017: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882609.55018: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882609.55020: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882609.55184: done processing included file 30529 1726882609.55186: iterating over new_blocks loaded from include file 30529 1726882609.55187: in VariableManager get_vars() 30529 1726882609.55203: done with get_vars() 30529 1726882609.55205: filtering new block on tags 30529 1726882609.55232: done filtering new block on tags 30529 1726882609.55234: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882609.55239: extending task lists for all hosts with included blocks 30529 1726882609.55388: done extending task lists 30529 1726882609.55389: done processing included files 30529 1726882609.55390: results queue empty 30529 1726882609.55390: checking for any_errors_fatal 30529 1726882609.55396: done 
checking for any_errors_fatal 30529 1726882609.55397: checking for max_fail_percentage 30529 1726882609.55398: done checking for max_fail_percentage 30529 1726882609.55399: checking to see if all hosts have failed and the running result is not ok 30529 1726882609.55399: done checking to see if all hosts have failed 30529 1726882609.55400: getting the remaining hosts for this loop 30529 1726882609.55402: done getting the remaining hosts for this loop 30529 1726882609.55404: getting the next task for host managed_node1 30529 1726882609.55409: done getting next task for host managed_node1 30529 1726882609.55411: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882609.55414: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882609.55416: getting variables 30529 1726882609.55417: in VariableManager get_vars() 30529 1726882609.55426: Calling all_inventory to load vars for managed_node1 30529 1726882609.55428: Calling groups_inventory to load vars for managed_node1 30529 1726882609.55430: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.55435: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.55437: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.55440: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.56579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882609.58065: done with get_vars() 30529 1726882609.58087: done getting variables 30529 1726882609.58214: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:49 -0400 (0:00:00.080) 0:00:23.608 ****** 30529 1726882609.58246: entering _queue_task() for managed_node1/stat 30529 1726882609.58822: worker is 1 (out of 1 available) 30529 1726882609.58830: exiting _queue_task() for managed_node1/stat 30529 1726882609.58840: done queuing things up, now waiting for results queue to drain 30529 1726882609.58843: waiting for pending results... 
30529 1726882609.58900: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882609.59067: in run() - task 12673a56-9f93-b0f1-edc0-000000000928 30529 1726882609.59071: variable 'ansible_search_path' from source: unknown 30529 1726882609.59073: variable 'ansible_search_path' from source: unknown 30529 1726882609.59099: calling self._execute() 30529 1726882609.59183: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.59194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.59209: variable 'omit' from source: magic vars 30529 1726882609.59608: variable 'ansible_distribution_major_version' from source: facts 30529 1726882609.59611: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882609.59614: variable 'omit' from source: magic vars 30529 1726882609.59619: variable 'omit' from source: magic vars 30529 1726882609.59705: variable 'interface' from source: play vars 30529 1726882609.59734: variable 'omit' from source: magic vars 30529 1726882609.59775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882609.59813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882609.59838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882609.59857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882609.59872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882609.59906: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882609.59914: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.59920: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.60030: Set connection var ansible_shell_executable to /bin/sh 30529 1726882609.60097: Set connection var ansible_pipelining to False 30529 1726882609.60100: Set connection var ansible_shell_type to sh 30529 1726882609.60103: Set connection var ansible_timeout to 10 30529 1726882609.60105: Set connection var ansible_connection to ssh 30529 1726882609.60107: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882609.60109: variable 'ansible_shell_executable' from source: unknown 30529 1726882609.60111: variable 'ansible_connection' from source: unknown 30529 1726882609.60112: variable 'ansible_module_compression' from source: unknown 30529 1726882609.60114: variable 'ansible_shell_type' from source: unknown 30529 1726882609.60116: variable 'ansible_shell_executable' from source: unknown 30529 1726882609.60120: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882609.60126: variable 'ansible_pipelining' from source: unknown 30529 1726882609.60132: variable 'ansible_timeout' from source: unknown 30529 1726882609.60138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882609.60321: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882609.60338: variable 'omit' from source: magic vars 30529 1726882609.60348: starting attempt loop 30529 1726882609.60355: running the handler 30529 1726882609.60477: _low_level_execute_command(): starting 30529 1726882609.60481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882609.61079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.61108: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882609.61134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882609.61229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.61251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.61265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.61466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.63014: stdout chunk (state=3): >>>/root <<< 30529 1726882609.63177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.63180: stdout chunk (state=3): >>><<< 30529 1726882609.63183: stderr chunk (state=3): >>><<< 30529 1726882609.63291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882609.63297: _low_level_execute_command(): starting 30529 1726882609.63300: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063 `" && echo ansible-tmp-1726882609.6320496-31611-27042405056063="` echo /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063 `" ) && sleep 0' 30529 1726882609.63877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.63962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.64008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.64026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.64044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.64129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.66026: stdout chunk (state=3): >>>ansible-tmp-1726882609.6320496-31611-27042405056063=/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063 <<< 30529 1726882609.66131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.66218: stderr chunk (state=3): >>><<< 30529 1726882609.66221: stdout chunk (state=3): >>><<< 30529 1726882609.66599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882609.6320496-31611-27042405056063=/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882609.66603: variable 'ansible_module_compression' from source: unknown 30529 1726882609.66606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882609.66608: variable 'ansible_facts' from source: unknown 30529 1726882609.66707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py 30529 1726882609.66961: Sending initial data 30529 1726882609.67009: Sent initial data (152 bytes) 30529 1726882609.68340: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882609.68376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.68487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.68665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.70216: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882609.70262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882609.70359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmplkz7t7ar /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py <<< 30529 1726882609.70374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py" <<< 30529 1726882609.70395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30529 1726882609.70415: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmplkz7t7ar" to remote "/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py" <<< 30529 1726882609.71185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.71197: stdout chunk (state=3): >>><<< 30529 1726882609.71211: stderr chunk (state=3): >>><<< 30529 1726882609.71274: done transferring module to remote 30529 1726882609.71324: _low_level_execute_command(): starting 30529 1726882609.71335: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/ /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py && sleep 0' 30529 1726882609.71876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.71890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882609.71913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882609.71931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882609.71949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 <<< 30529 1726882609.71964: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882609.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.72009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882609.72105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.72308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.72456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.74199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.74209: stdout chunk (state=3): >>><<< 30529 1726882609.74219: stderr chunk (state=3): >>><<< 30529 1726882609.74245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882609.74254: _low_level_execute_command(): starting 30529 1726882609.74268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/AnsiballZ_stat.py && sleep 0' 30529 1726882609.74819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.74837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882609.74857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882609.74874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882609.74946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882609.74978: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.75119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882609.75155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.75310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.75364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.90223: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882609.91800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882609.91804: stdout chunk (state=3): >>><<< 30529 1726882609.91807: stderr chunk (state=3): >>><<< 30529 1726882609.91809: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882609.91812: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882609.91814: _low_level_execute_command(): starting 30529 1726882609.91816: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882609.6320496-31611-27042405056063/ > /dev/null 2>&1 && sleep 0' 30529 1726882609.92943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882609.93058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882609.93071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882609.93207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882609.93276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882609.93332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882609.95135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882609.95499: stderr chunk (state=3): >>><<< 30529 1726882609.95507: stdout chunk (state=3): >>><<< 30529 1726882609.95511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 30529 1726882609.95513: handler run complete 30529 1726882609.95516: attempt loop complete, returning result 30529 1726882609.95517: _execute() done 30529 1726882609.95520: dumping result to json 30529 1726882609.95521: done dumping result, returning 30529 1726882609.95523: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000000928] 30529 1726882609.95525: sending task result for task 12673a56-9f93-b0f1-edc0-000000000928 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882609.95667: no more pending results, returning what we have 30529 1726882609.95671: results queue empty 30529 1726882609.95672: checking for any_errors_fatal 30529 1726882609.95674: done checking for any_errors_fatal 30529 1726882609.95675: checking for max_fail_percentage 30529 1726882609.95677: done checking for max_fail_percentage 30529 1726882609.95678: checking to see if all hosts have failed and the running result is not ok 30529 1726882609.95679: done checking to see if all hosts have failed 30529 1726882609.95680: getting the remaining hosts for this loop 30529 1726882609.95682: done getting the remaining hosts for this loop 30529 1726882609.95686: getting the next task for host managed_node1 30529 1726882609.95698: done getting next task for host managed_node1 30529 1726882609.95700: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30529 1726882609.95704: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882609.95710: getting variables 30529 1726882609.95712: in VariableManager get_vars() 30529 1726882609.95742: Calling all_inventory to load vars for managed_node1 30529 1726882609.95744: Calling groups_inventory to load vars for managed_node1 30529 1726882609.95747: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882609.95879: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000928 30529 1726882609.95882: WORKER PROCESS EXITING 30529 1726882609.95892: Calling all_plugins_play to load vars for managed_node1 30529 1726882609.95901: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882609.95905: Calling groups_plugins_play to load vars for managed_node1 30529 1726882609.98374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.01827: done with get_vars() 30529 1726882610.01850: done getting variables 30529 1726882610.01914: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882610.02033: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:50 -0400 (0:00:00.438) 0:00:24.046 ****** 30529 1726882610.02065: entering _queue_task() for managed_node1/assert 30529 1726882610.02822: worker is 1 (out of 1 available) 30529 1726882610.02836: exiting _queue_task() for managed_node1/assert 30529 1726882610.02852: done queuing things up, now waiting for results queue to drain 30529 1726882610.02854: waiting for pending results... 30529 1726882610.03286: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' 30529 1726882610.03542: in run() - task 12673a56-9f93-b0f1-edc0-0000000008a9 30529 1726882610.03563: variable 'ansible_search_path' from source: unknown 30529 1726882610.03753: variable 'ansible_search_path' from source: unknown 30529 1726882610.03757: calling self._execute() 30529 1726882610.03887: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.03906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.03922: variable 'omit' from source: magic vars 30529 1726882610.04724: variable 'ansible_distribution_major_version' from source: facts 30529 1726882610.04742: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882610.04752: variable 'omit' from source: magic vars 30529 1726882610.04808: variable 'omit' from source: magic vars 30529 1726882610.05134: variable 'interface' from source: play vars 30529 1726882610.05138: variable 'omit' from source: magic vars 30529 1726882610.05172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882610.05277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882610.05307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30529 1726882610.05370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.05387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.05568: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882610.05571: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.05574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.05661: Set connection var ansible_shell_executable to /bin/sh 30529 1726882610.05798: Set connection var ansible_pipelining to False 30529 1726882610.05897: Set connection var ansible_shell_type to sh 30529 1726882610.05900: Set connection var ansible_timeout to 10 30529 1726882610.05903: Set connection var ansible_connection to ssh 30529 1726882610.05905: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882610.05907: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.05909: variable 'ansible_connection' from source: unknown 30529 1726882610.05911: variable 'ansible_module_compression' from source: unknown 30529 1726882610.05913: variable 'ansible_shell_type' from source: unknown 30529 1726882610.05915: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.05917: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.05919: variable 'ansible_pipelining' from source: unknown 30529 1726882610.05921: variable 'ansible_timeout' from source: unknown 30529 1726882610.05923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.06223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882610.06239: variable 'omit' from source: magic vars 30529 1726882610.06249: starting attempt loop 30529 1726882610.06256: running the handler 30529 1726882610.06603: variable 'interface_stat' from source: set_fact 30529 1726882610.06619: Evaluated conditional (not interface_stat.stat.exists): True 30529 1726882610.06629: handler run complete 30529 1726882610.06760: attempt loop complete, returning result 30529 1726882610.06764: _execute() done 30529 1726882610.06767: dumping result to json 30529 1726882610.06769: done dumping result, returning 30529 1726882610.06771: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' [12673a56-9f93-b0f1-edc0-0000000008a9] 30529 1726882610.06773: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008a9 30529 1726882610.06848: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008a9 30529 1726882610.06852: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882610.06904: no more pending results, returning what we have 30529 1726882610.06908: results queue empty 30529 1726882610.06909: checking for any_errors_fatal 30529 1726882610.06919: done checking for any_errors_fatal 30529 1726882610.06919: checking for max_fail_percentage 30529 1726882610.06921: done checking for max_fail_percentage 30529 1726882610.06922: checking to see if all hosts have failed and the running result is not ok 30529 1726882610.06923: done checking to see if all hosts have failed 30529 1726882610.06924: getting the remaining hosts for this loop 30529 1726882610.06926: done getting the remaining hosts for this loop 30529 1726882610.06929: getting the next task for host managed_node1 30529 1726882610.06940: done getting next task for host managed_node1 
30529 1726882610.06943: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30529 1726882610.06948: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882610.06954: getting variables 30529 1726882610.06956: in VariableManager get_vars() 30529 1726882610.06988: Calling all_inventory to load vars for managed_node1 30529 1726882610.06991: Calling groups_inventory to load vars for managed_node1 30529 1726882610.06996: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882610.07018: Calling all_plugins_play to load vars for managed_node1 30529 1726882610.07021: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882610.07024: Calling groups_plugins_play to load vars for managed_node1 30529 1726882610.09490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.12663: done with get_vars() 30529 1726882610.12688: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 21:36:50 -0400 (0:00:00.109) 0:00:24.155 ****** 30529 1726882610.12984: entering _queue_task() for managed_node1/include_tasks 30529 1726882610.13737: worker is 1 (out of 1 available) 30529 1726882610.13748: exiting _queue_task() for managed_node1/include_tasks 30529 1726882610.13761: done queuing things up, now waiting for results queue to drain 30529 1726882610.13763: waiting for pending results... 30529 1726882610.14214: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 30529 1726882610.14341: in run() - task 12673a56-9f93-b0f1-edc0-0000000008ad 30529 1726882610.14362: variable 'ansible_search_path' from source: unknown 30529 1726882610.14371: variable 'ansible_search_path' from source: unknown 30529 1726882610.14441: calling self._execute() 30529 1726882610.14587: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.14854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.14858: variable 'omit' from source: magic vars 30529 1726882610.15599: variable 'ansible_distribution_major_version' from source: facts 30529 1726882610.15604: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882610.15607: _execute() done 30529 1726882610.15609: dumping result to json 30529 1726882610.15613: done dumping result, returning 30529 1726882610.15615: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-0000000008ad] 30529 1726882610.15618: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008ad 30529 1726882610.15684: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008ad 30529 1726882610.15688: WORKER PROCESS EXITING 30529 1726882610.15724: no more pending results, returning what we have 30529 1726882610.15730: in VariableManager get_vars() 30529 1726882610.15768: Calling all_inventory to load vars for managed_node1 30529 
1726882610.15770: Calling groups_inventory to load vars for managed_node1 30529 1726882610.15774: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882610.15788: Calling all_plugins_play to load vars for managed_node1 30529 1726882610.15792: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882610.15798: Calling groups_plugins_play to load vars for managed_node1 30529 1726882610.19521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.23511: done with get_vars() 30529 1726882610.23534: variable 'ansible_search_path' from source: unknown 30529 1726882610.23536: variable 'ansible_search_path' from source: unknown 30529 1726882610.23545: variable 'item' from source: include params 30529 1726882610.23651: variable 'item' from source: include params 30529 1726882610.23684: we have included files to process 30529 1726882610.23685: generating all_blocks data 30529 1726882610.23687: done generating all_blocks data 30529 1726882610.23691: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882610.23692: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882610.23898: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882610.25598: done processing included file 30529 1726882610.25600: iterating over new_blocks loaded from include file 30529 1726882610.25602: in VariableManager get_vars() 30529 1726882610.25619: done with get_vars() 30529 1726882610.25620: filtering new block on tags 30529 1726882610.25687: done filtering new block on tags 30529 1726882610.25689: in VariableManager get_vars() 30529 1726882610.25908: done with get_vars() 30529 1726882610.25910: 
filtering new block on tags 30529 1726882610.25964: done filtering new block on tags 30529 1726882610.25967: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 30529 1726882610.25972: extending task lists for all hosts with included blocks 30529 1726882610.26487: done extending task lists 30529 1726882610.26488: done processing included files 30529 1726882610.26489: results queue empty 30529 1726882610.26490: checking for any_errors_fatal 30529 1726882610.26695: done checking for any_errors_fatal 30529 1726882610.26697: checking for max_fail_percentage 30529 1726882610.26698: done checking for max_fail_percentage 30529 1726882610.26699: checking to see if all hosts have failed and the running result is not ok 30529 1726882610.26700: done checking to see if all hosts have failed 30529 1726882610.26701: getting the remaining hosts for this loop 30529 1726882610.26702: done getting the remaining hosts for this loop 30529 1726882610.26705: getting the next task for host managed_node1 30529 1726882610.26710: done getting next task for host managed_node1 30529 1726882610.26712: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882610.26715: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882610.26717: getting variables 30529 1726882610.26718: in VariableManager get_vars() 30529 1726882610.26726: Calling all_inventory to load vars for managed_node1 30529 1726882610.26728: Calling groups_inventory to load vars for managed_node1 30529 1726882610.26730: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882610.26736: Calling all_plugins_play to load vars for managed_node1 30529 1726882610.26738: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882610.26741: Calling groups_plugins_play to load vars for managed_node1 30529 1726882610.28621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.31719: done with get_vars() 30529 1726882610.31742: done getting variables 30529 1726882610.31785: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:50 -0400 
(0:00:00.188) 0:00:24.344 ****** 30529 1726882610.31821: entering _queue_task() for managed_node1/set_fact 30529 1726882610.32578: worker is 1 (out of 1 available) 30529 1726882610.32591: exiting _queue_task() for managed_node1/set_fact 30529 1726882610.32806: done queuing things up, now waiting for results queue to drain 30529 1726882610.32808: waiting for pending results... 30529 1726882610.33316: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882610.33321: in run() - task 12673a56-9f93-b0f1-edc0-000000000946 30529 1726882610.33419: variable 'ansible_search_path' from source: unknown 30529 1726882610.33700: variable 'ansible_search_path' from source: unknown 30529 1726882610.33704: calling self._execute() 30529 1726882610.33718: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.33733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.33748: variable 'omit' from source: magic vars 30529 1726882610.34418: variable 'ansible_distribution_major_version' from source: facts 30529 1726882610.34599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882610.34602: variable 'omit' from source: magic vars 30529 1726882610.34605: variable 'omit' from source: magic vars 30529 1726882610.34607: variable 'omit' from source: magic vars 30529 1726882610.34741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882610.34815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882610.34923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882610.34945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.34965: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.35037: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882610.35047: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.35056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.35280: Set connection var ansible_shell_executable to /bin/sh 30529 1726882610.35291: Set connection var ansible_pipelining to False 30529 1726882610.35398: Set connection var ansible_shell_type to sh 30529 1726882610.35402: Set connection var ansible_timeout to 10 30529 1726882610.35405: Set connection var ansible_connection to ssh 30529 1726882610.35407: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882610.35419: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.35426: variable 'ansible_connection' from source: unknown 30529 1726882610.35434: variable 'ansible_module_compression' from source: unknown 30529 1726882610.35441: variable 'ansible_shell_type' from source: unknown 30529 1726882610.35474: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.35483: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.35491: variable 'ansible_pipelining' from source: unknown 30529 1726882610.35501: variable 'ansible_timeout' from source: unknown 30529 1726882610.35687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.35835: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882610.35852: variable 'omit' from source: magic vars 30529 1726882610.35863: starting 
attempt loop 30529 1726882610.35869: running the handler 30529 1726882610.35916: handler run complete 30529 1726882610.35932: attempt loop complete, returning result 30529 1726882610.36099: _execute() done 30529 1726882610.36102: dumping result to json 30529 1726882610.36104: done dumping result, returning 30529 1726882610.36107: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-000000000946] 30529 1726882610.36109: sending task result for task 12673a56-9f93-b0f1-edc0-000000000946 30529 1726882610.36175: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000946 30529 1726882610.36179: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30529 1726882610.36235: no more pending results, returning what we have 30529 1726882610.36240: results queue empty 30529 1726882610.36241: checking for any_errors_fatal 30529 1726882610.36243: done checking for any_errors_fatal 30529 1726882610.36243: checking for max_fail_percentage 30529 1726882610.36246: done checking for max_fail_percentage 30529 1726882610.36247: checking to see if all hosts have failed and the running result is not ok 30529 1726882610.36248: done checking to see if all hosts have failed 30529 1726882610.36249: getting the remaining hosts for this loop 30529 1726882610.36250: done getting the remaining hosts for this loop 30529 1726882610.36255: getting the next task for host managed_node1 30529 1726882610.36264: done getting next task for host managed_node1 30529 1726882610.36268: ^ task is: TASK: Stat profile file 30529 1726882610.36274: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882610.36277: getting variables 30529 1726882610.36279: in VariableManager get_vars() 30529 1726882610.36314: Calling all_inventory to load vars for managed_node1 30529 1726882610.36317: Calling groups_inventory to load vars for managed_node1 30529 1726882610.36320: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882610.36332: Calling all_plugins_play to load vars for managed_node1 30529 1726882610.36336: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882610.36339: Calling groups_plugins_play to load vars for managed_node1 30529 1726882610.39282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.42424: done with get_vars() 30529 1726882610.42451: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 
2024 21:36:50 -0400 (0:00:00.109) 0:00:24.453 ****** 30529 1726882610.42757: entering _queue_task() for managed_node1/stat 30529 1726882610.43510: worker is 1 (out of 1 available) 30529 1726882610.43523: exiting _queue_task() for managed_node1/stat 30529 1726882610.43535: done queuing things up, now waiting for results queue to drain 30529 1726882610.43538: waiting for pending results... 30529 1726882610.44410: running TaskExecutor() for managed_node1/TASK: Stat profile file 30529 1726882610.44417: in run() - task 12673a56-9f93-b0f1-edc0-000000000947 30529 1726882610.44420: variable 'ansible_search_path' from source: unknown 30529 1726882610.44805: variable 'ansible_search_path' from source: unknown 30529 1726882610.44840: calling self._execute() 30529 1726882610.44930: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.44933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.44973: variable 'omit' from source: magic vars 30529 1726882610.46117: variable 'ansible_distribution_major_version' from source: facts 30529 1726882610.46131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882610.46138: variable 'omit' from source: magic vars 30529 1726882610.46798: variable 'omit' from source: magic vars 30529 1726882610.46801: variable 'profile' from source: play vars 30529 1726882610.46804: variable 'interface' from source: play vars 30529 1726882610.46806: variable 'interface' from source: play vars 30529 1726882610.46808: variable 'omit' from source: magic vars 30529 1726882610.47034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882610.47070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882610.47091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882610.47114: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.47129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882610.47160: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882610.47164: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.47169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.47679: Set connection var ansible_shell_executable to /bin/sh 30529 1726882610.47682: Set connection var ansible_pipelining to False 30529 1726882610.47685: Set connection var ansible_shell_type to sh 30529 1726882610.47800: Set connection var ansible_timeout to 10 30529 1726882610.47803: Set connection var ansible_connection to ssh 30529 1726882610.47806: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882610.47808: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.47811: variable 'ansible_connection' from source: unknown 30529 1726882610.47813: variable 'ansible_module_compression' from source: unknown 30529 1726882610.47815: variable 'ansible_shell_type' from source: unknown 30529 1726882610.47817: variable 'ansible_shell_executable' from source: unknown 30529 1726882610.47819: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882610.47821: variable 'ansible_pipelining' from source: unknown 30529 1726882610.47824: variable 'ansible_timeout' from source: unknown 30529 1726882610.47826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882610.48544: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882610.48556: variable 'omit' from source: magic vars 30529 1726882610.48562: starting attempt loop 30529 1726882610.48565: running the handler 30529 1726882610.48576: _low_level_execute_command(): starting 30529 1726882610.48584: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882610.50171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882610.50188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882610.50615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882610.50704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882610.50772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.52449: stdout chunk (state=3): >>>/root <<< 30529 1726882610.52782: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 30529 1726882610.53319: stderr chunk (state=3): >>><<< 30529 1726882610.53322: stdout chunk (state=3): >>><<< 30529 1726882610.53326: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882610.53328: _low_level_execute_command(): starting 30529 1726882610.53331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198 `" && echo ansible-tmp-1726882610.532338-31657-142192868475198="` echo /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198 `" ) && sleep 0' 30529 1726882610.54485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882610.54492: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882610.54496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.54509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882610.54512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.54671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882610.54675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882610.55021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.57034: stdout chunk (state=3): >>>ansible-tmp-1726882610.532338-31657-142192868475198=/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198 <<< 30529 1726882610.57112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882610.57143: stderr chunk (state=3): >>><<< 30529 1726882610.57152: stdout chunk (state=3): >>><<< 30529 1726882610.57174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882610.532338-31657-142192868475198=/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882610.57697: variable 'ansible_module_compression' from source: unknown 30529 1726882610.57700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882610.57703: variable 'ansible_facts' from source: unknown 30529 1726882610.57933: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py 30529 1726882610.58316: Sending initial data 30529 1726882610.58320: Sent initial data (152 bytes) 30529 1726882610.59411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.59527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882610.59620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882610.59624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882610.59752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.61206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882610.61235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882610.61306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp39uqxfrf /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py <<< 30529 1726882610.61310: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py" <<< 30529 1726882610.61371: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp39uqxfrf" to remote "/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py" <<< 30529 1726882610.62765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882610.62898: stderr chunk (state=3): >>><<< 30529 1726882610.62901: stdout chunk (state=3): >>><<< 30529 1726882610.62903: done transferring module to remote 30529 1726882610.62905: _low_level_execute_command(): starting 30529 1726882610.62908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/ /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py && sleep 0' 30529 1726882610.64357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882610.64360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882610.64363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.64365: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882610.64366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882610.64372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.64708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.66371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882610.66405: stderr chunk (state=3): >>><<< 30529 1726882610.66414: stdout chunk (state=3): >>><<< 30529 1726882610.66437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882610.66446: _low_level_execute_command(): starting 30529 1726882610.66670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/AnsiballZ_stat.py && sleep 0' 30529 1726882610.67964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882610.68200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882610.68312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882610.68392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.83309: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882610.84639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882610.84643: stdout chunk (state=3): >>><<< 30529 1726882610.84646: stderr chunk (state=3): >>><<< 30529 1726882610.84825: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882610.84829: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882610.84832: _low_level_execute_command(): starting 30529 1726882610.84835: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882610.532338-31657-142192868475198/ > /dev/null 2>&1 && sleep 0' 30529 1726882610.86048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882610.86111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882610.86250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882610.86381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882610.86424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882610.86465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882610.88316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882610.88327: stdout chunk (state=3): >>><<< 30529 1726882610.88363: stderr chunk (state=3): >>><<< 30529 1726882610.88384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882610.88498: handler run complete 30529 1726882610.88502: attempt loop complete, returning result 30529 1726882610.88507: _execute() done 30529 1726882610.88514: dumping result to json 30529 1726882610.88522: done dumping result, returning 30529 1726882610.88536: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-000000000947] 30529 1726882610.88544: sending task result for task 12673a56-9f93-b0f1-edc0-000000000947 30529 1726882610.88860: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000947 30529 1726882610.88863: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882610.88954: no more pending results, returning what we have 30529 1726882610.88959: results queue empty 30529 1726882610.88960: checking for any_errors_fatal 30529 1726882610.88966: done checking for any_errors_fatal 30529 1726882610.88967: checking for max_fail_percentage 30529 1726882610.88969: done checking for max_fail_percentage 30529 1726882610.88970: checking to see if all hosts have failed and the running result is not ok 30529 1726882610.88971: done checking to see if all hosts have failed 30529 1726882610.88972: getting the remaining hosts for this loop 30529 1726882610.88973: done getting the remaining hosts for this loop 30529 1726882610.88978: getting the next task for host managed_node1 30529 1726882610.88987: done getting next task for host managed_node1 30529 1726882610.88995: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882610.89000: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882610.89004: getting variables 30529 1726882610.89006: in VariableManager get_vars() 30529 1726882610.89039: Calling all_inventory to load vars for managed_node1 30529 1726882610.89041: Calling groups_inventory to load vars for managed_node1 30529 1726882610.89045: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882610.89057: Calling all_plugins_play to load vars for managed_node1 30529 1726882610.89062: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882610.89065: Calling groups_plugins_play to load vars for managed_node1 30529 1726882610.93276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882610.98704: done with get_vars() 30529 1726882610.98729: done getting variables 30529 1726882610.99199: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:50 -0400 (0:00:00.564) 0:00:25.018 ****** 30529 1726882610.99236: entering _queue_task() for managed_node1/set_fact 30529 1726882611.00430: worker is 1 (out of 1 available) 30529 1726882611.00442: exiting _queue_task() for managed_node1/set_fact 30529 1726882611.00455: done queuing things up, now waiting for results queue to drain 30529 1726882611.00456: waiting for pending results... 30529 1726882611.00614: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882611.00959: in run() - task 12673a56-9f93-b0f1-edc0-000000000948 30529 1726882611.00963: variable 'ansible_search_path' from source: unknown 30529 1726882611.00965: variable 'ansible_search_path' from source: unknown 30529 1726882611.00967: calling self._execute() 30529 1726882611.01203: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.01213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.01229: variable 'omit' from source: magic vars 30529 1726882611.02158: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.02162: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.02260: variable 'profile_stat' from source: set_fact 30529 1726882611.02702: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882611.02706: when evaluation is False, skipping this task 30529 1726882611.02708: _execute() done 30529 1726882611.02710: dumping result to json 30529 1726882611.02712: done dumping 
result, returning 30529 1726882611.02715: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-000000000948] 30529 1726882611.02718: sending task result for task 12673a56-9f93-b0f1-edc0-000000000948 30529 1726882611.02791: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000948 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882611.02844: no more pending results, returning what we have 30529 1726882611.02850: results queue empty 30529 1726882611.02851: checking for any_errors_fatal 30529 1726882611.02862: done checking for any_errors_fatal 30529 1726882611.02863: checking for max_fail_percentage 30529 1726882611.02865: done checking for max_fail_percentage 30529 1726882611.02866: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.02867: done checking to see if all hosts have failed 30529 1726882611.02868: getting the remaining hosts for this loop 30529 1726882611.02870: done getting the remaining hosts for this loop 30529 1726882611.02874: getting the next task for host managed_node1 30529 1726882611.02883: done getting next task for host managed_node1 30529 1726882611.02885: ^ task is: TASK: Get NM profile info 30529 1726882611.02896: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.02900: getting variables 30529 1726882611.02902: in VariableManager get_vars() 30529 1726882611.02934: Calling all_inventory to load vars for managed_node1 30529 1726882611.02937: Calling groups_inventory to load vars for managed_node1 30529 1726882611.02941: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.02954: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.02957: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.02960: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.03801: WORKER PROCESS EXITING 30529 1726882611.06985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.11264: done with get_vars() 30529 1726882611.11302: done getting variables 30529 1726882611.11365: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 
20 September 2024 21:36:51 -0400 (0:00:00.121) 0:00:25.140 ****** 30529 1726882611.11418: entering _queue_task() for managed_node1/shell 30529 1726882611.12097: worker is 1 (out of 1 available) 30529 1726882611.12107: exiting _queue_task() for managed_node1/shell 30529 1726882611.12118: done queuing things up, now waiting for results queue to drain 30529 1726882611.12119: waiting for pending results... 30529 1726882611.12140: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882611.12224: in run() - task 12673a56-9f93-b0f1-edc0-000000000949 30529 1726882611.12242: variable 'ansible_search_path' from source: unknown 30529 1726882611.12245: variable 'ansible_search_path' from source: unknown 30529 1726882611.12291: calling self._execute() 30529 1726882611.12378: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.12397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.12405: variable 'omit' from source: magic vars 30529 1726882611.12970: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.12987: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.12994: variable 'omit' from source: magic vars 30529 1726882611.13044: variable 'omit' from source: magic vars 30529 1726882611.13511: variable 'profile' from source: play vars 30529 1726882611.13515: variable 'interface' from source: play vars 30529 1726882611.13799: variable 'interface' from source: play vars 30529 1726882611.13802: variable 'omit' from source: magic vars 30529 1726882611.13804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882611.13806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882611.13909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882611.13927: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.13941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.13971: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882611.13974: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.13977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.14184: Set connection var ansible_shell_executable to /bin/sh 30529 1726882611.14191: Set connection var ansible_pipelining to False 30529 1726882611.14196: Set connection var ansible_shell_type to sh 30529 1726882611.14311: Set connection var ansible_timeout to 10 30529 1726882611.14339: Set connection var ansible_connection to ssh 30529 1726882611.14347: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882611.14369: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.14373: variable 'ansible_connection' from source: unknown 30529 1726882611.14375: variable 'ansible_module_compression' from source: unknown 30529 1726882611.14378: variable 'ansible_shell_type' from source: unknown 30529 1726882611.14380: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.14382: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.14384: variable 'ansible_pipelining' from source: unknown 30529 1726882611.14387: variable 'ansible_timeout' from source: unknown 30529 1726882611.14395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.14754: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882611.14765: variable 'omit' from source: magic vars 30529 1726882611.15000: starting attempt loop 30529 1726882611.15004: running the handler 30529 1726882611.15007: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882611.15010: _low_level_execute_command(): starting 30529 1726882611.15012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882611.15572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882611.15584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882611.15608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882611.15645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882611.15734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882611.15791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.15868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.17525: stdout chunk (state=3): >>>/root <<< 30529 1726882611.17892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882611.17909: stdout chunk (state=3): >>><<< 30529 1726882611.18215: stderr chunk (state=3): >>><<< 30529 1726882611.18220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882611.18223: _low_level_execute_command(): starting 30529 1726882611.18225: _low_level_execute_command(): executing: /bin/sh -c '( umask 
77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164 `" && echo ansible-tmp-1726882611.1812627-31677-55414431057164="` echo /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164 `" ) && sleep 0' 30529 1726882611.19308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882611.19371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882611.19451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882611.19473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882611.19502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.19699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.21586: stdout chunk (state=3): >>>ansible-tmp-1726882611.1812627-31677-55414431057164=/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164 <<< 30529 1726882611.21674: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30529 1726882611.21678: stdout chunk (state=3): >>><<< 30529 1726882611.21680: stderr chunk (state=3): >>><<< 30529 1726882611.21683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882611.1812627-31677-55414431057164=/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882611.22010: variable 'ansible_module_compression' from source: unknown 30529 1726882611.22013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882611.22015: variable 'ansible_facts' from source: unknown 30529 1726882611.22018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py 30529 1726882611.22285: Sending initial 
data 30529 1726882611.22339: Sent initial data (155 bytes) 30529 1726882611.23778: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882611.23938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.23971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.25566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882611.25626: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882611.25680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgb9lbv80 /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py <<< 30529 1726882611.25685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py" <<< 30529 1726882611.25746: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgb9lbv80" to remote "/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py" <<< 30529 1726882611.27328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882611.27332: stdout chunk (state=3): >>><<< 30529 1726882611.27334: stderr chunk (state=3): >>><<< 30529 1726882611.27336: done transferring module to remote 30529 1726882611.27338: _low_level_execute_command(): starting 30529 1726882611.27340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/ /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py && sleep 0' 30529 1726882611.28249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882611.28392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.28469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.30214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882611.30226: stdout chunk (state=3): >>><<< 30529 1726882611.30236: stderr chunk (state=3): >>><<< 30529 1726882611.30254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882611.30265: _low_level_execute_command(): starting 30529 1726882611.30279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/AnsiballZ_command.py && sleep 0' 30529 1726882611.30950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882611.30964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882611.31037: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882611.31084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882611.31105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 
1726882611.31126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.31207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.47695: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:36:51.459395", "end": "2024-09-20 21:36:51.475910", "delta": "0:00:00.016515", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882611.49177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882611.49201: stderr chunk (state=3): >>><<< 30529 1726882611.49205: stdout chunk (state=3): >>><<< 30529 1726882611.49227: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:36:51.459395", "end": "2024-09-20 21:36:51.475910", "delta": "0:00:00.016515", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 30529 1726882611.49256: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882611.49263: _low_level_execute_command(): starting 30529 1726882611.49268: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882611.1812627-31677-55414431057164/ > /dev/null 2>&1 && sleep 0' 30529 1726882611.49729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882611.49733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882611.49735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882611.49741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882611.49743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882611.49791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882611.49803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882611.49842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882611.51898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882611.51902: stdout chunk (state=3): >>><<< 30529 1726882611.51905: stderr chunk (state=3): >>><<< 30529 1726882611.51907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882611.51910: handler run complete 30529 1726882611.51912: Evaluated conditional (False): False 30529 1726882611.51914: attempt loop complete, returning result 30529 1726882611.51916: _execute() done 30529 1726882611.51917: dumping result to json 30529 1726882611.51919: done dumping result, returning 30529 1726882611.51921: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-000000000949] 30529 1726882611.51924: sending task result for task 12673a56-9f93-b0f1-edc0-000000000949 30529 1726882611.52001: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000949 30529 1726882611.52004: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016515", "end": "2024-09-20 21:36:51.475910", "rc": 0, "start": "2024-09-20 21:36:51.459395" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30529 1726882611.52070: no more pending results, returning what we have 30529 1726882611.52074: results queue empty 30529 1726882611.52075: checking for any_errors_fatal 30529 1726882611.52081: done checking for any_errors_fatal 30529 1726882611.52082: checking for max_fail_percentage 30529 1726882611.52084: done checking for max_fail_percentage 30529 1726882611.52085: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.52086: done checking to see if all hosts have failed 30529 1726882611.52087: getting the remaining hosts for this loop 30529 1726882611.52088: done getting the remaining hosts for this loop 30529 1726882611.52092: getting the next task for host managed_node1 30529 1726882611.52102: done getting next task for host managed_node1 30529 1726882611.52105: ^ task is: TASK: Set NM 
profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882611.52109: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882611.52113: getting variables 30529 1726882611.52114: in VariableManager get_vars() 30529 1726882611.52145: Calling all_inventory to load vars for managed_node1 30529 1726882611.52148: Calling groups_inventory to load vars for managed_node1 30529 1726882611.52151: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.52161: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.52164: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.52167: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.53132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.54176: done with get_vars() 30529 1726882611.54191: done getting variables 30529 1726882611.54238: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:51 -0400 (0:00:00.428) 0:00:25.568 ****** 30529 1726882611.54265: entering _queue_task() for managed_node1/set_fact 30529 1726882611.54539: worker is 1 (out of 1 available) 30529 1726882611.54553: exiting _queue_task() for managed_node1/set_fact 30529 1726882611.54564: done queuing things up, now waiting for results queue to drain 30529 1726882611.54565: waiting for pending results... 
30529 1726882611.54777: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882611.54886: in run() - task 12673a56-9f93-b0f1-edc0-00000000094a 30529 1726882611.54905: variable 'ansible_search_path' from source: unknown 30529 1726882611.54908: variable 'ansible_search_path' from source: unknown 30529 1726882611.54936: calling self._execute() 30529 1726882611.55013: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.55018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.55026: variable 'omit' from source: magic vars 30529 1726882611.55323: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.55340: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.55460: variable 'nm_profile_exists' from source: set_fact 30529 1726882611.55469: Evaluated conditional (nm_profile_exists.rc == 0): True 30529 1726882611.55481: variable 'omit' from source: magic vars 30529 1726882611.55519: variable 'omit' from source: magic vars 30529 1726882611.55558: variable 'omit' from source: magic vars 30529 1726882611.55598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882611.55627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882611.55652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882611.55676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.55680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.55711: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
30529 1726882611.55714: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.55717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.55799: Set connection var ansible_shell_executable to /bin/sh 30529 1726882611.55804: Set connection var ansible_pipelining to False 30529 1726882611.55807: Set connection var ansible_shell_type to sh 30529 1726882611.55817: Set connection var ansible_timeout to 10 30529 1726882611.55820: Set connection var ansible_connection to ssh 30529 1726882611.55824: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882611.55840: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.55843: variable 'ansible_connection' from source: unknown 30529 1726882611.55845: variable 'ansible_module_compression' from source: unknown 30529 1726882611.55848: variable 'ansible_shell_type' from source: unknown 30529 1726882611.55850: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.55852: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.55856: variable 'ansible_pipelining' from source: unknown 30529 1726882611.55858: variable 'ansible_timeout' from source: unknown 30529 1726882611.55861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.55963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882611.55971: variable 'omit' from source: magic vars 30529 1726882611.55976: starting attempt loop 30529 1726882611.55979: running the handler 30529 1726882611.55989: handler run complete 30529 1726882611.56003: attempt loop complete, returning result 30529 1726882611.56008: _execute() done 
30529 1726882611.56012: dumping result to json 30529 1726882611.56014: done dumping result, returning 30529 1726882611.56018: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-00000000094a] 30529 1726882611.56020: sending task result for task 12673a56-9f93-b0f1-edc0-00000000094a 30529 1726882611.56121: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000094a 30529 1726882611.56131: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30529 1726882611.56183: no more pending results, returning what we have 30529 1726882611.56186: results queue empty 30529 1726882611.56187: checking for any_errors_fatal 30529 1726882611.56198: done checking for any_errors_fatal 30529 1726882611.56199: checking for max_fail_percentage 30529 1726882611.56200: done checking for max_fail_percentage 30529 1726882611.56201: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.56202: done checking to see if all hosts have failed 30529 1726882611.56202: getting the remaining hosts for this loop 30529 1726882611.56204: done getting the remaining hosts for this loop 30529 1726882611.56208: getting the next task for host managed_node1 30529 1726882611.56220: done getting next task for host managed_node1 30529 1726882611.56223: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882611.56228: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.56231: getting variables 30529 1726882611.56232: in VariableManager get_vars() 30529 1726882611.56259: Calling all_inventory to load vars for managed_node1 30529 1726882611.56261: Calling groups_inventory to load vars for managed_node1 30529 1726882611.56264: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.56273: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.56276: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.56278: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.57122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.58220: done with get_vars() 30529 1726882611.58239: done getting variables 30529 1726882611.58309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.58453: variable 'profile' from source: play vars 30529 
1726882611.58458: variable 'interface' from source: play vars 30529 1726882611.58528: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:51 -0400 (0:00:00.042) 0:00:25.611 ****** 30529 1726882611.58559: entering _queue_task() for managed_node1/command 30529 1726882611.58835: worker is 1 (out of 1 available) 30529 1726882611.58848: exiting _queue_task() for managed_node1/command 30529 1726882611.58860: done queuing things up, now waiting for results queue to drain 30529 1726882611.58862: waiting for pending results... 30529 1726882611.59311: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882611.59316: in run() - task 12673a56-9f93-b0f1-edc0-00000000094c 30529 1726882611.59320: variable 'ansible_search_path' from source: unknown 30529 1726882611.59322: variable 'ansible_search_path' from source: unknown 30529 1726882611.59324: calling self._execute() 30529 1726882611.59405: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.59416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.59430: variable 'omit' from source: magic vars 30529 1726882611.59771: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.59784: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.59874: variable 'profile_stat' from source: set_fact 30529 1726882611.59885: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882611.59890: when evaluation is False, skipping this task 30529 1726882611.59899: _execute() done 30529 1726882611.59905: dumping result to json 30529 1726882611.59913: done dumping result, returning 30529 1726882611.59918: done running 
TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000094c] 30529 1726882611.59925: sending task result for task 12673a56-9f93-b0f1-edc0-00000000094c skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882611.60065: no more pending results, returning what we have 30529 1726882611.60070: results queue empty 30529 1726882611.60071: checking for any_errors_fatal 30529 1726882611.60075: done checking for any_errors_fatal 30529 1726882611.60076: checking for max_fail_percentage 30529 1726882611.60077: done checking for max_fail_percentage 30529 1726882611.60078: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.60079: done checking to see if all hosts have failed 30529 1726882611.60080: getting the remaining hosts for this loop 30529 1726882611.60081: done getting the remaining hosts for this loop 30529 1726882611.60084: getting the next task for host managed_node1 30529 1726882611.60092: done getting next task for host managed_node1 30529 1726882611.60097: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882611.60103: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
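The skip result above follows the guarded-stat pattern used throughout these tests: the log shows `Evaluated conditional (profile_stat.stat.exists): False`, so the task is skipped before any command runs. A minimal sketch of what such a task in `get_profile_stat.yml` plausibly looks like — the `grep` command, file path, and register name here are assumptions for illustration, not taken from this log:

```yaml
# Hypothetical sketch of the guarded task logged above. The command is
# gated on the earlier stat result, which is why the log records
# "Evaluated conditional (profile_stat.stat.exists): False" followed by
# "when evaluation is False, skipping this task".
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep '^# ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment   # assumed register name
  when: profile_stat.stat.exists
```

When the conditional is false, Ansible emits exactly the skip payload seen in the log (`"false_condition": "profile_stat.stat.exists"`, `"skip_reason": "Conditional result was False"`) without dispatching the module to the remote host.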
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.60108: getting variables 30529 1726882611.60116: in VariableManager get_vars() 30529 1726882611.60149: Calling all_inventory to load vars for managed_node1 30529 1726882611.60151: Calling groups_inventory to load vars for managed_node1 30529 1726882611.60154: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.60164: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.60167: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.60170: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.60743: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000094c 30529 1726882611.60747: WORKER PROCESS EXITING 30529 1726882611.61750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.63014: done with get_vars() 30529 1726882611.63032: done getting variables 30529 1726882611.63094: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.63192: variable 'profile' from source: play vars 30529 1726882611.63199: variable 'interface' from 
source: play vars 30529 1726882611.63257: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:51 -0400 (0:00:00.047) 0:00:25.658 ****** 30529 1726882611.63291: entering _queue_task() for managed_node1/set_fact 30529 1726882611.63577: worker is 1 (out of 1 available) 30529 1726882611.63595: exiting _queue_task() for managed_node1/set_fact 30529 1726882611.63609: done queuing things up, now waiting for results queue to drain 30529 1726882611.63611: waiting for pending results... 30529 1726882611.64213: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882611.64219: in run() - task 12673a56-9f93-b0f1-edc0-00000000094d 30529 1726882611.64222: variable 'ansible_search_path' from source: unknown 30529 1726882611.64225: variable 'ansible_search_path' from source: unknown 30529 1726882611.64228: calling self._execute() 30529 1726882611.64313: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.64317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.64320: variable 'omit' from source: magic vars 30529 1726882611.64832: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.64835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.65068: variable 'profile_stat' from source: set_fact 30529 1726882611.65078: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882611.65081: when evaluation is False, skipping this task 30529 1726882611.65084: _execute() done 30529 1726882611.65086: dumping result to json 30529 1726882611.65088: done dumping result, returning 30529 1726882611.65127: done running TaskExecutor() for managed_node1/TASK: 
Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000094d] 30529 1726882611.65130: sending task result for task 12673a56-9f93-b0f1-edc0-00000000094d 30529 1726882611.65385: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000094d 30529 1726882611.65388: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882611.65437: no more pending results, returning what we have 30529 1726882611.65441: results queue empty 30529 1726882611.65442: checking for any_errors_fatal 30529 1726882611.65450: done checking for any_errors_fatal 30529 1726882611.65451: checking for max_fail_percentage 30529 1726882611.65453: done checking for max_fail_percentage 30529 1726882611.65454: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.65455: done checking to see if all hosts have failed 30529 1726882611.65455: getting the remaining hosts for this loop 30529 1726882611.65457: done getting the remaining hosts for this loop 30529 1726882611.65461: getting the next task for host managed_node1 30529 1726882611.65470: done getting next task for host managed_node1 30529 1726882611.65473: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882611.65479: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.65483: getting variables 30529 1726882611.65485: in VariableManager get_vars() 30529 1726882611.65521: Calling all_inventory to load vars for managed_node1 30529 1726882611.65523: Calling groups_inventory to load vars for managed_node1 30529 1726882611.65527: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.65542: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.65546: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.65549: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.67228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.68873: done with get_vars() 30529 1726882611.68897: done getting variables 30529 1726882611.69116: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.69231: variable 'profile' from source: play vars 30529 1726882611.69235: variable 'interface' from source: play vars 30529 1726882611.69403: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] 
**************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:51 -0400 (0:00:00.061) 0:00:25.720 ****** 30529 1726882611.69437: entering _queue_task() for managed_node1/command 30529 1726882611.69992: worker is 1 (out of 1 available) 30529 1726882611.70314: exiting _queue_task() for managed_node1/command 30529 1726882611.70327: done queuing things up, now waiting for results queue to drain 30529 1726882611.70329: waiting for pending results... 30529 1726882611.70815: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882611.71124: in run() - task 12673a56-9f93-b0f1-edc0-00000000094e 30529 1726882611.71129: variable 'ansible_search_path' from source: unknown 30529 1726882611.71132: variable 'ansible_search_path' from source: unknown 30529 1726882611.71157: calling self._execute() 30529 1726882611.71904: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.71907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.71910: variable 'omit' from source: magic vars 30529 1726882611.72674: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.72695: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.72943: variable 'profile_stat' from source: set_fact 30529 1726882611.73007: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882611.73015: when evaluation is False, skipping this task 30529 1726882611.73021: _execute() done 30529 1726882611.73028: dumping result to json 30529 1726882611.73035: done dumping result, returning 30529 1726882611.73050: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000094e] 30529 1726882611.73152: sending task result for task 
12673a56-9f93-b0f1-edc0-00000000094e 30529 1726882611.73225: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000094e 30529 1726882611.73229: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882611.73309: no more pending results, returning what we have 30529 1726882611.73313: results queue empty 30529 1726882611.73315: checking for any_errors_fatal 30529 1726882611.73323: done checking for any_errors_fatal 30529 1726882611.73323: checking for max_fail_percentage 30529 1726882611.73325: done checking for max_fail_percentage 30529 1726882611.73326: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.73327: done checking to see if all hosts have failed 30529 1726882611.73328: getting the remaining hosts for this loop 30529 1726882611.73329: done getting the remaining hosts for this loop 30529 1726882611.73333: getting the next task for host managed_node1 30529 1726882611.73342: done getting next task for host managed_node1 30529 1726882611.73344: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882611.73349: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.73354: getting variables 30529 1726882611.73356: in VariableManager get_vars() 30529 1726882611.73385: Calling all_inventory to load vars for managed_node1 30529 1726882611.73387: Calling groups_inventory to load vars for managed_node1 30529 1726882611.73596: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.73614: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.73619: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.73623: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.80638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.82146: done with get_vars() 30529 1726882611.82169: done getting variables 30529 1726882611.82225: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.82328: variable 'profile' from source: play vars 30529 1726882611.82331: variable 'interface' from source: play vars 30529 1726882611.82395: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:51 -0400 (0:00:00.129) 
0:00:25.850 ****** 30529 1726882611.82424: entering _queue_task() for managed_node1/set_fact 30529 1726882611.82792: worker is 1 (out of 1 available) 30529 1726882611.83006: exiting _queue_task() for managed_node1/set_fact 30529 1726882611.83016: done queuing things up, now waiting for results queue to drain 30529 1726882611.83018: waiting for pending results... 30529 1726882611.83156: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882611.83367: in run() - task 12673a56-9f93-b0f1-edc0-00000000094f 30529 1726882611.83372: variable 'ansible_search_path' from source: unknown 30529 1726882611.83374: variable 'ansible_search_path' from source: unknown 30529 1726882611.83377: calling self._execute() 30529 1726882611.83411: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.83417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.83428: variable 'omit' from source: magic vars 30529 1726882611.83803: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.83811: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.83934: variable 'profile_stat' from source: set_fact 30529 1726882611.83943: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882611.83946: when evaluation is False, skipping this task 30529 1726882611.83949: _execute() done 30529 1726882611.83953: dumping result to json 30529 1726882611.83955: done dumping result, returning 30529 1726882611.83963: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000094f] 30529 1726882611.83968: sending task result for task 12673a56-9f93-b0f1-edc0-00000000094f 30529 1726882611.84065: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000094f 30529 1726882611.84069: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": 
false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882611.84149: no more pending results, returning what we have 30529 1726882611.84154: results queue empty 30529 1726882611.84155: checking for any_errors_fatal 30529 1726882611.84163: done checking for any_errors_fatal 30529 1726882611.84164: checking for max_fail_percentage 30529 1726882611.84166: done checking for max_fail_percentage 30529 1726882611.84167: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.84168: done checking to see if all hosts have failed 30529 1726882611.84168: getting the remaining hosts for this loop 30529 1726882611.84170: done getting the remaining hosts for this loop 30529 1726882611.84174: getting the next task for host managed_node1 30529 1726882611.84185: done getting next task for host managed_node1 30529 1726882611.84188: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30529 1726882611.84196: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882611.84203: getting variables 30529 1726882611.84205: in VariableManager get_vars() 30529 1726882611.84240: Calling all_inventory to load vars for managed_node1 30529 1726882611.84242: Calling groups_inventory to load vars for managed_node1 30529 1726882611.84246: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.84260: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.84264: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.84268: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.85712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.87419: done with get_vars() 30529 1726882611.87438: done getting variables 30529 1726882611.87497: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.87606: variable 'profile' from source: play vars 30529 1726882611.87610: variable 'interface' from source: play vars 30529 1726882611.87664: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:36:51 -0400 (0:00:00.052) 0:00:25.903 ****** 30529 1726882611.87699: entering _queue_task() for managed_node1/assert 30529 1726882611.87964: worker is 1 (out of 1 available) 30529 1726882611.87977: exiting _queue_task() for managed_node1/assert 30529 1726882611.87992: done queuing things up, now waiting for results queue to drain 30529 1726882611.88098: waiting for pending results... 
30529 1726882611.88358: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' 30529 1726882611.88374: in run() - task 12673a56-9f93-b0f1-edc0-0000000008ae 30529 1726882611.88388: variable 'ansible_search_path' from source: unknown 30529 1726882611.88391: variable 'ansible_search_path' from source: unknown 30529 1726882611.88427: calling self._execute() 30529 1726882611.88774: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.88777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.88780: variable 'omit' from source: magic vars 30529 1726882611.88940: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.88952: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.88959: variable 'omit' from source: magic vars 30529 1726882611.89011: variable 'omit' from source: magic vars 30529 1726882611.89114: variable 'profile' from source: play vars 30529 1726882611.89118: variable 'interface' from source: play vars 30529 1726882611.89179: variable 'interface' from source: play vars 30529 1726882611.89201: variable 'omit' from source: magic vars 30529 1726882611.89240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882611.89275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882611.89300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882611.89316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.89328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.89359: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882611.89362: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.89364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.89474: Set connection var ansible_shell_executable to /bin/sh 30529 1726882611.89477: Set connection var ansible_pipelining to False 30529 1726882611.89480: Set connection var ansible_shell_type to sh 30529 1726882611.89490: Set connection var ansible_timeout to 10 30529 1726882611.89498: Set connection var ansible_connection to ssh 30529 1726882611.89503: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882611.89530: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.89534: variable 'ansible_connection' from source: unknown 30529 1726882611.89536: variable 'ansible_module_compression' from source: unknown 30529 1726882611.89538: variable 'ansible_shell_type' from source: unknown 30529 1726882611.89541: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.89543: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.89545: variable 'ansible_pipelining' from source: unknown 30529 1726882611.89548: variable 'ansible_timeout' from source: unknown 30529 1726882611.89553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.89696: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882611.89709: variable 'omit' from source: magic vars 30529 1726882611.89714: starting attempt loop 30529 1726882611.89717: running the handler 30529 1726882611.89822: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882611.89828: Evaluated conditional 
(lsr_net_profile_exists): True 30529 1726882611.89841: handler run complete 30529 1726882611.89852: attempt loop complete, returning result 30529 1726882611.89854: _execute() done 30529 1726882611.89857: dumping result to json 30529 1726882611.89859: done dumping result, returning 30529 1726882611.89867: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' [12673a56-9f93-b0f1-edc0-0000000008ae] 30529 1726882611.89872: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008ae ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882611.90022: no more pending results, returning what we have 30529 1726882611.90027: results queue empty 30529 1726882611.90028: checking for any_errors_fatal 30529 1726882611.90034: done checking for any_errors_fatal 30529 1726882611.90035: checking for max_fail_percentage 30529 1726882611.90037: done checking for max_fail_percentage 30529 1726882611.90038: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.90040: done checking to see if all hosts have failed 30529 1726882611.90041: getting the remaining hosts for this loop 30529 1726882611.90042: done getting the remaining hosts for this loop 30529 1726882611.90047: getting the next task for host managed_node1 30529 1726882611.90055: done getting next task for host managed_node1 30529 1726882611.90058: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30529 1726882611.90061: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
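The `ok` result above comes from an `assert` action: the log shows `Evaluated conditional (lsr_net_profile_exists): True`, then `handler run complete` and `All assertions passed`. A minimal sketch of the assertion as it plausibly appears in `assert_profile_present.yml` — the fact name is taken from the log, but the message text is an assumption:

```yaml
# Hypothetical sketch of the assertion logged above. lsr_net_profile_exists
# is a fact set earlier via set_fact; assert only checks it locally, so the
# task reports "changed": false on success.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is not present"   # assumed message
```

Because `assert` runs entirely on the controller side of the task result, a passing check never marks the host changed, matching the `"changed": false` / `MSG: All assertions passed` output in the log.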
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882611.90066: getting variables 30529 1726882611.90069: in VariableManager get_vars() 30529 1726882611.90107: Calling all_inventory to load vars for managed_node1 30529 1726882611.90110: Calling groups_inventory to load vars for managed_node1 30529 1726882611.90114: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.90126: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.90131: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.90135: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.90711: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008ae 30529 1726882611.90715: WORKER PROCESS EXITING 30529 1726882611.92636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882611.94236: done with get_vars() 30529 1726882611.94256: done getting variables 30529 1726882611.94316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882611.94428: variable 'profile' from source: play vars 30529 1726882611.94431: variable 'interface' from source: play vars 30529 1726882611.94488: variable 'interface' from source: play vars TASK [Assert that the 
ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:36:51 -0400 (0:00:00.068) 0:00:25.971 ****** 30529 1726882611.94526: entering _queue_task() for managed_node1/assert 30529 1726882611.94907: worker is 1 (out of 1 available) 30529 1726882611.94920: exiting _queue_task() for managed_node1/assert 30529 1726882611.94932: done queuing things up, now waiting for results queue to drain 30529 1726882611.94933: waiting for pending results... 30529 1726882611.95176: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' 30529 1726882611.95599: in run() - task 12673a56-9f93-b0f1-edc0-0000000008af 30529 1726882611.95604: variable 'ansible_search_path' from source: unknown 30529 1726882611.95607: variable 'ansible_search_path' from source: unknown 30529 1726882611.95609: calling self._execute() 30529 1726882611.95612: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.95614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.95617: variable 'omit' from source: magic vars 30529 1726882611.95959: variable 'ansible_distribution_major_version' from source: facts 30529 1726882611.95974: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882611.95983: variable 'omit' from source: magic vars 30529 1726882611.96037: variable 'omit' from source: magic vars 30529 1726882611.96140: variable 'profile' from source: play vars 30529 1726882611.96148: variable 'interface' from source: play vars 30529 1726882611.96388: variable 'interface' from source: play vars 30529 1726882611.96396: variable 'omit' from source: magic vars 30529 1726882611.96399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882611.96507: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882611.96534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882611.96558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.96578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882611.96775: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882611.96786: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.96796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882611.96957: Set connection var ansible_shell_executable to /bin/sh 30529 1726882611.96970: Set connection var ansible_pipelining to False 30529 1726882611.96978: Set connection var ansible_shell_type to sh 30529 1726882611.97016: Set connection var ansible_timeout to 10 30529 1726882611.97025: Set connection var ansible_connection to ssh 30529 1726882611.97036: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882611.97068: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.97077: variable 'ansible_connection' from source: unknown 30529 1726882611.97084: variable 'ansible_module_compression' from source: unknown 30529 1726882611.97097: variable 'ansible_shell_type' from source: unknown 30529 1726882611.97105: variable 'ansible_shell_executable' from source: unknown 30529 1726882611.97112: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882611.97120: variable 'ansible_pipelining' from source: unknown 30529 1726882611.97127: variable 'ansible_timeout' from source: unknown 30529 1726882611.97136: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882611.97316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882611.97319: variable 'omit' from source: magic vars 30529 1726882611.97322: starting attempt loop 30529 1726882611.97329: running the handler 30529 1726882611.97462: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30529 1726882611.97502: Evaluated conditional (lsr_net_profile_ansible_managed): True 30529 1726882611.97505: handler run complete 30529 1726882611.97508: attempt loop complete, returning result 30529 1726882611.97512: _execute() done 30529 1726882611.97519: dumping result to json 30529 1726882611.97531: done dumping result, returning 30529 1726882611.97583: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' [12673a56-9f93-b0f1-edc0-0000000008af] 30529 1726882611.97586: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008af ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882611.98187: no more pending results, returning what we have 30529 1726882611.98195: results queue empty 30529 1726882611.98196: checking for any_errors_fatal 30529 1726882611.98201: done checking for any_errors_fatal 30529 1726882611.98202: checking for max_fail_percentage 30529 1726882611.98204: done checking for max_fail_percentage 30529 1726882611.98204: checking to see if all hosts have failed and the running result is not ok 30529 1726882611.98205: done checking to see if all hosts have failed 30529 1726882611.98206: getting the remaining hosts for this loop 30529 1726882611.98208: done getting the remaining hosts for this loop 30529 1726882611.98211: getting the next task for host managed_node1 30529 
1726882611.98217: done getting next task for host managed_node1 30529 1726882611.98220: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30529 1726882611.98223: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882611.98226: getting variables 30529 1726882611.98228: in VariableManager get_vars() 30529 1726882611.98254: Calling all_inventory to load vars for managed_node1 30529 1726882611.98257: Calling groups_inventory to load vars for managed_node1 30529 1726882611.98260: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882611.98270: Calling all_plugins_play to load vars for managed_node1 30529 1726882611.98273: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882611.98276: Calling groups_plugins_play to load vars for managed_node1 30529 1726882611.98907: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008af 30529 1726882611.98911: WORKER PROCESS EXITING 30529 1726882611.99774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.01322: done with get_vars() 30529 1726882612.01342: done getting variables 30529 1726882612.01402: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882612.01515: variable 'profile' from source: play vars 30529 1726882612.01519: variable 'interface' from source: play vars 30529 1726882612.01582: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:36:52 -0400 (0:00:00.070) 0:00:26.042 ****** 30529 1726882612.01619: entering _queue_task() for managed_node1/assert 30529 1726882612.02127: worker is 1 (out of 1 available) 30529 1726882612.02136: exiting _queue_task() for managed_node1/assert 30529 
1726882612.02146: done queuing things up, now waiting for results queue to drain 30529 1726882612.02148: waiting for pending results... 30529 1726882612.02246: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr 30529 1726882612.02379: in run() - task 12673a56-9f93-b0f1-edc0-0000000008b0 30529 1726882612.02405: variable 'ansible_search_path' from source: unknown 30529 1726882612.02413: variable 'ansible_search_path' from source: unknown 30529 1726882612.02452: calling self._execute() 30529 1726882612.02549: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.02592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.02597: variable 'omit' from source: magic vars 30529 1726882612.02957: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.02975: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.02987: variable 'omit' from source: magic vars 30529 1726882612.03134: variable 'omit' from source: magic vars 30529 1726882612.03147: variable 'profile' from source: play vars 30529 1726882612.03156: variable 'interface' from source: play vars 30529 1726882612.03226: variable 'interface' from source: play vars 30529 1726882612.03255: variable 'omit' from source: magic vars 30529 1726882612.03301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882612.03338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882612.03367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882612.03397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.03416: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.03447: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882612.03459: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.03467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.03575: Set connection var ansible_shell_executable to /bin/sh 30529 1726882612.03585: Set connection var ansible_pipelining to False 30529 1726882612.03596: Set connection var ansible_shell_type to sh 30529 1726882612.03611: Set connection var ansible_timeout to 10 30529 1726882612.03677: Set connection var ansible_connection to ssh 30529 1726882612.03680: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882612.03682: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.03684: variable 'ansible_connection' from source: unknown 30529 1726882612.03686: variable 'ansible_module_compression' from source: unknown 30529 1726882612.03688: variable 'ansible_shell_type' from source: unknown 30529 1726882612.03694: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.03696: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.03698: variable 'ansible_pipelining' from source: unknown 30529 1726882612.03701: variable 'ansible_timeout' from source: unknown 30529 1726882612.03703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.03835: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882612.03852: variable 'omit' from source: magic vars 30529 1726882612.03863: starting 
attempt loop 30529 1726882612.03869: running the handler 30529 1726882612.03981: variable 'lsr_net_profile_fingerprint' from source: set_fact 30529 1726882612.03996: Evaluated conditional (lsr_net_profile_fingerprint): True 30529 1726882612.04114: handler run complete 30529 1726882612.04118: attempt loop complete, returning result 30529 1726882612.04120: _execute() done 30529 1726882612.04122: dumping result to json 30529 1726882612.04124: done dumping result, returning 30529 1726882612.04126: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr [12673a56-9f93-b0f1-edc0-0000000008b0] 30529 1726882612.04128: sending task result for task 12673a56-9f93-b0f1-edc0-0000000008b0 30529 1726882612.04198: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000008b0 30529 1726882612.04202: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882612.04264: no more pending results, returning what we have 30529 1726882612.04268: results queue empty 30529 1726882612.04269: checking for any_errors_fatal 30529 1726882612.04277: done checking for any_errors_fatal 30529 1726882612.04278: checking for max_fail_percentage 30529 1726882612.04280: done checking for max_fail_percentage 30529 1726882612.04281: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.04282: done checking to see if all hosts have failed 30529 1726882612.04283: getting the remaining hosts for this loop 30529 1726882612.04285: done getting the remaining hosts for this loop 30529 1726882612.04291: getting the next task for host managed_node1 30529 1726882612.04303: done getting next task for host managed_node1 30529 1726882612.04307: ^ task is: TASK: Conditional asserts 30529 1726882612.04310: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882612.04316: getting variables 30529 1726882612.04318: in VariableManager get_vars() 30529 1726882612.04348: Calling all_inventory to load vars for managed_node1 30529 1726882612.04351: Calling groups_inventory to load vars for managed_node1 30529 1726882612.04355: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.04367: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.04370: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.04373: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.06754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.09899: done with get_vars() 30529 1726882612.09929: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:36:52 -0400 (0:00:00.084) 0:00:26.126 ****** 30529 1726882612.10054: entering _queue_task() for managed_node1/include_tasks 30529 1726882612.10447: worker is 1 (out of 1 available) 30529 1726882612.10464: exiting _queue_task() for managed_node1/include_tasks 30529 1726882612.10478: done queuing things up, now waiting for results queue to drain 30529 1726882612.10479: waiting for pending results... 
30529 1726882612.10813: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882612.10868: in run() - task 12673a56-9f93-b0f1-edc0-0000000005ba 30529 1726882612.10886: variable 'ansible_search_path' from source: unknown 30529 1726882612.10899: variable 'ansible_search_path' from source: unknown 30529 1726882612.11192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882612.14096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882612.14162: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882612.14211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882612.14250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882612.14284: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882612.14407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882612.14415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882612.14447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882612.14492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882612.14518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882612.14738: dumping result to json 30529 1726882612.14743: done dumping result, returning 30529 1726882612.14749: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-0000000005ba] 30529 1726882612.14754: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005ba 30529 1726882612.15057: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005ba 30529 1726882612.15061: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } 30529 1726882612.15125: no more pending results, returning what we have 30529 1726882612.15129: results queue empty 30529 1726882612.15130: checking for any_errors_fatal 30529 1726882612.15136: done checking for any_errors_fatal 30529 1726882612.15137: checking for max_fail_percentage 30529 1726882612.15139: done checking for max_fail_percentage 30529 1726882612.15140: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.15141: done checking to see if all hosts have failed 30529 1726882612.15142: getting the remaining hosts for this loop 30529 1726882612.15144: done getting the remaining hosts for this loop 30529 1726882612.15148: getting the next task for host managed_node1 30529 1726882612.15156: done getting next task for host managed_node1 30529 1726882612.15159: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882612.15162: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882612.15166: getting variables 30529 1726882612.15168: in VariableManager get_vars() 30529 1726882612.15206: Calling all_inventory to load vars for managed_node1 30529 1726882612.15209: Calling groups_inventory to load vars for managed_node1 30529 1726882612.15213: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.15226: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.15230: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.15233: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.18278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.20695: done with get_vars() 30529 1726882612.20719: done getting variables 30529 1726882612.20810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882612.20928: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:36:52 -0400 (0:00:00.109) 0:00:26.235 ****** 30529 1726882612.20958: entering _queue_task() for managed_node1/debug 30529 1726882612.21313: worker is 1 (out of 
1 available) 30529 1726882612.21323: exiting _queue_task() for managed_node1/debug 30529 1726882612.21334: done queuing things up, now waiting for results queue to drain 30529 1726882612.21335: waiting for pending results... 30529 1726882612.21716: running TaskExecutor() for managed_node1/TASK: Success in test 'I can create a profile without autoconnect' 30529 1726882612.21743: in run() - task 12673a56-9f93-b0f1-edc0-0000000005bb 30529 1726882612.21763: variable 'ansible_search_path' from source: unknown 30529 1726882612.21770: variable 'ansible_search_path' from source: unknown 30529 1726882612.21821: calling self._execute() 30529 1726882612.21935: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.21948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.21974: variable 'omit' from source: magic vars 30529 1726882612.23147: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.23168: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.23203: variable 'omit' from source: magic vars 30529 1726882612.23263: variable 'omit' from source: magic vars 30529 1726882612.23373: variable 'lsr_description' from source: include params 30529 1726882612.23427: variable 'omit' from source: magic vars 30529 1726882612.23459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882612.23514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882612.23606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882612.23614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.23617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882612.23644: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882612.23648: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.23650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.23834: Set connection var ansible_shell_executable to /bin/sh 30529 1726882612.23844: Set connection var ansible_pipelining to False 30529 1726882612.23847: Set connection var ansible_shell_type to sh 30529 1726882612.23860: Set connection var ansible_timeout to 10 30529 1726882612.23862: Set connection var ansible_connection to ssh 30529 1726882612.23867: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882612.23904: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.23908: variable 'ansible_connection' from source: unknown 30529 1726882612.23911: variable 'ansible_module_compression' from source: unknown 30529 1726882612.23916: variable 'ansible_shell_type' from source: unknown 30529 1726882612.23919: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.23922: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.23926: variable 'ansible_pipelining' from source: unknown 30529 1726882612.23929: variable 'ansible_timeout' from source: unknown 30529 1726882612.23931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.24085: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882612.24104: variable 'omit' from source: magic vars 30529 1726882612.24109: starting attempt loop 30529 1726882612.24112: running the handler 30529 
1726882612.24186: handler run complete 30529 1726882612.24190: attempt loop complete, returning result 30529 1726882612.24192: _execute() done 30529 1726882612.24196: dumping result to json 30529 1726882612.24198: done dumping result, returning 30529 1726882612.24201: done running TaskExecutor() for managed_node1/TASK: Success in test 'I can create a profile without autoconnect' [12673a56-9f93-b0f1-edc0-0000000005bb] 30529 1726882612.24203: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005bb 30529 1726882612.24353: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005bb 30529 1726882612.24356: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 30529 1726882612.24439: no more pending results, returning what we have 30529 1726882612.24442: results queue empty 30529 1726882612.24443: checking for any_errors_fatal 30529 1726882612.24448: done checking for any_errors_fatal 30529 1726882612.24449: checking for max_fail_percentage 30529 1726882612.24450: done checking for max_fail_percentage 30529 1726882612.24451: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.24452: done checking to see if all hosts have failed 30529 1726882612.24453: getting the remaining hosts for this loop 30529 1726882612.24454: done getting the remaining hosts for this loop 30529 1726882612.24458: getting the next task for host managed_node1 30529 1726882612.24466: done getting next task for host managed_node1 30529 1726882612.24469: ^ task is: TASK: Cleanup 30529 1726882612.24471: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882612.24476: getting variables 30529 1726882612.24478: in VariableManager get_vars() 30529 1726882612.24506: Calling all_inventory to load vars for managed_node1 30529 1726882612.24508: Calling groups_inventory to load vars for managed_node1 30529 1726882612.24511: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.24521: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.24523: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.24526: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.27183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.28861: done with get_vars() 30529 1726882612.28887: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:36:52 -0400 (0:00:00.080) 0:00:26.315 ****** 30529 1726882612.28975: entering _queue_task() for managed_node1/include_tasks 30529 1726882612.29527: worker is 1 (out of 1 available) 30529 1726882612.29537: exiting _queue_task() for managed_node1/include_tasks 30529 1726882612.29548: done queuing things up, now waiting for results queue to drain 30529 1726882612.29550: waiting for pending results... 
30529 1726882612.29678: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882612.29796: in run() - task 12673a56-9f93-b0f1-edc0-0000000005bf 30529 1726882612.29892: variable 'ansible_search_path' from source: unknown 30529 1726882612.29897: variable 'ansible_search_path' from source: unknown 30529 1726882612.29900: variable 'lsr_cleanup' from source: include params 30529 1726882612.30076: variable 'lsr_cleanup' from source: include params 30529 1726882612.30170: variable 'omit' from source: magic vars 30529 1726882612.30331: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.30346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.30365: variable 'omit' from source: magic vars 30529 1726882612.30614: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.30629: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.30641: variable 'item' from source: unknown 30529 1726882612.30716: variable 'item' from source: unknown 30529 1726882612.30760: variable 'item' from source: unknown 30529 1726882612.30868: variable 'item' from source: unknown 30529 1726882612.30979: dumping result to json 30529 1726882612.30982: done dumping result, returning 30529 1726882612.30985: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-0000000005bf] 30529 1726882612.30987: sending task result for task 12673a56-9f93-b0f1-edc0-0000000005bf 30529 1726882612.31245: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000005bf 30529 1726882612.31249: WORKER PROCESS EXITING 30529 1726882612.31327: no more pending results, returning what we have 30529 1726882612.31331: in VariableManager get_vars() 30529 1726882612.31364: Calling all_inventory to load vars for managed_node1 30529 1726882612.31367: Calling groups_inventory to load vars for managed_node1 30529 1726882612.31370: Calling 
all_plugins_inventory to load vars for managed_node1 30529 1726882612.31380: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.31383: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.31386: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.33003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.34607: done with get_vars() 30529 1726882612.34625: variable 'ansible_search_path' from source: unknown 30529 1726882612.34627: variable 'ansible_search_path' from source: unknown 30529 1726882612.34666: we have included files to process 30529 1726882612.34667: generating all_blocks data 30529 1726882612.34669: done generating all_blocks data 30529 1726882612.34676: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882612.34677: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882612.34679: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882612.34889: done processing included file 30529 1726882612.34891: iterating over new_blocks loaded from include file 30529 1726882612.34894: in VariableManager get_vars() 30529 1726882612.34919: done with get_vars() 30529 1726882612.34921: filtering new block on tags 30529 1726882612.34947: done filtering new block on tags 30529 1726882612.34949: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882612.34954: extending task lists for all hosts with included blocks 
30529 1726882612.36463: done extending task lists 30529 1726882612.36465: done processing included files 30529 1726882612.36466: results queue empty 30529 1726882612.36466: checking for any_errors_fatal 30529 1726882612.36469: done checking for any_errors_fatal 30529 1726882612.36470: checking for max_fail_percentage 30529 1726882612.36471: done checking for max_fail_percentage 30529 1726882612.36472: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.36473: done checking to see if all hosts have failed 30529 1726882612.36474: getting the remaining hosts for this loop 30529 1726882612.36475: done getting the remaining hosts for this loop 30529 1726882612.36478: getting the next task for host managed_node1 30529 1726882612.36483: done getting next task for host managed_node1 30529 1726882612.36485: ^ task is: TASK: Cleanup profile and device 30529 1726882612.36488: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882612.36490: getting variables 30529 1726882612.36491: in VariableManager get_vars() 30529 1726882612.36503: Calling all_inventory to load vars for managed_node1 30529 1726882612.36505: Calling groups_inventory to load vars for managed_node1 30529 1726882612.36507: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.36519: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.36522: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.36525: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.37763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.39303: done with get_vars() 30529 1726882612.39323: done getting variables 30529 1726882612.39374: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:36:52 -0400 (0:00:00.104) 0:00:26.420 ****** 30529 1726882612.39407: entering _queue_task() for managed_node1/shell 30529 1726882612.39775: worker is 1 (out of 1 available) 30529 1726882612.39787: exiting _queue_task() for managed_node1/shell 30529 1726882612.39801: done queuing things up, now waiting for results queue to drain 30529 1726882612.39803: waiting for pending results... 
30529 1726882612.40117: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882612.40201: in run() - task 12673a56-9f93-b0f1-edc0-0000000009a0 30529 1726882612.40205: variable 'ansible_search_path' from source: unknown 30529 1726882612.40208: variable 'ansible_search_path' from source: unknown 30529 1726882612.40240: calling self._execute() 30529 1726882612.40325: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.40330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.40333: variable 'omit' from source: magic vars 30529 1726882612.40629: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.40639: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.40647: variable 'omit' from source: magic vars 30529 1726882612.40686: variable 'omit' from source: magic vars 30529 1726882612.40792: variable 'interface' from source: play vars 30529 1726882612.40809: variable 'omit' from source: magic vars 30529 1726882612.40842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882612.40871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882612.40886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882612.40909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.40919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.40941: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882612.40944: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.40947: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.41024: Set connection var ansible_shell_executable to /bin/sh 30529 1726882612.41027: Set connection var ansible_pipelining to False 30529 1726882612.41030: Set connection var ansible_shell_type to sh 30529 1726882612.41038: Set connection var ansible_timeout to 10 30529 1726882612.41041: Set connection var ansible_connection to ssh 30529 1726882612.41045: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882612.41062: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.41065: variable 'ansible_connection' from source: unknown 30529 1726882612.41068: variable 'ansible_module_compression' from source: unknown 30529 1726882612.41070: variable 'ansible_shell_type' from source: unknown 30529 1726882612.41073: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.41075: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.41077: variable 'ansible_pipelining' from source: unknown 30529 1726882612.41080: variable 'ansible_timeout' from source: unknown 30529 1726882612.41084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.41181: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882612.41194: variable 'omit' from source: magic vars 30529 1726882612.41197: starting attempt loop 30529 1726882612.41200: running the handler 30529 1726882612.41208: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882612.41228: _low_level_execute_command(): starting 30529 1726882612.41235: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882612.41737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882612.41740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882612.41743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882612.41746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882612.41748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882612.41786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882612.41805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.41865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.43551: stdout chunk (state=3): >>>/root <<< 30529 1726882612.43825: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882612.43829: stdout chunk (state=3): >>><<< 30529 1726882612.43831: stderr chunk (state=3): >>><<< 30529 1726882612.43835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882612.43840: _low_level_execute_command(): starting 30529 1726882612.43843: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612 `" && echo ansible-tmp-1726882612.4373312-31746-126071504341612="` echo /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612 `" ) && sleep 0' 30529 1726882612.44413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
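The first low-level command above (`/bin/sh -c 'echo ~ && sleep 0'`) is how Ansible discovers the remote user's home directory: it lets the remote POSIX shell expand the tilde and reads the result from stdout (`/root` here). A minimal local reproduction of that probe, assuming a POSIX `/bin/sh` is available:

```python
import subprocess

# Reproduce Ansible's home-directory probe: the shell expands ~ itself,
# so no remote Python is needed for this step.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = result.stdout.strip()

assert result.returncode == 0
assert home.startswith("/")  # e.g. "/root" in the log above
```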
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882612.44439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882612.44496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882612.44530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.44573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.46513: stdout chunk (state=3): >>>ansible-tmp-1726882612.4373312-31746-126071504341612=/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612 <<< 30529 1726882612.46828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882612.46831: stdout chunk (state=3): >>><<< 30529 1726882612.46834: stderr chunk (state=3): >>><<< 30529 1726882612.46838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.4373312-31746-126071504341612=/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
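The `mkdir` command above creates Ansible's per-task remote scratch directory. Judging from the name in the log (`ansible-tmp-1726882612.4373312-31746-126071504341612`), the pattern is `ansible-tmp-<epoch>-<pid>-<random>`, created under `umask 77` so only the owner can read it; the exact components are an assumption inferred from the log, not taken from Ansible source. A local sketch of the same scheme:

```python
import os
import random
import re
import tempfile
import time

# Stand-in for the remote ~/.ansible/tmp base directory.
basedir = tempfile.mkdtemp()

# Assumed naming components: epoch timestamp, worker pid, random suffix.
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
tmpdir = os.path.join(basedir, name)

os.makedirs(tmpdir)
os.chmod(tmpdir, 0o700)  # equivalent effect of the shell's "umask 77 && mkdir"

assert re.match(r"ansible-tmp-\d+\.\d+-\d+-\d+$", name)
assert (os.stat(tmpdir).st_mode & 0o777) == 0o700
```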
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882612.46841: variable 'ansible_module_compression' from source: unknown 30529 1726882612.46962: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882612.47058: variable 'ansible_facts' from source: unknown 30529 1726882612.47301: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py 30529 1726882612.47532: Sending initial data 30529 1726882612.47540: Sent initial data (156 bytes) 30529 1726882612.48095: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882612.48125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882612.48180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882612.48245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882612.48262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882612.48280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.48360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.49878: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882612.49944: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30529 1726882612.50012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppzaaxyst /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py <<< 30529 1726882612.50017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py" <<< 30529 1726882612.50049: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppzaaxyst" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py" <<< 30529 1726882612.51123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882612.51126: stderr chunk (state=3): >>><<< 30529 1726882612.51128: stdout chunk (state=3): >>><<< 30529 1726882612.51175: done transferring module to remote 30529 1726882612.51187: _low_level_execute_command(): starting 30529 1726882612.51232: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/ /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py && sleep 0' 30529 1726882612.52799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882612.52854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882612.52885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.52961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.54676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882612.54723: stderr chunk (state=3): >>><<< 30529 1726882612.54735: stdout chunk (state=3): >>><<< 30529 1726882612.54762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882612.54771: _low_level_execute_command(): starting 30529 1726882612.54780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/AnsiballZ_command.py && sleep 0' 30529 1726882612.55339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882612.55353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882612.55366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882612.55412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882612.55424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882612.55504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882612.55524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.55604: stderr 
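After SFTP-ing `AnsiballZ_command.py`, the log shows Ansible running `chmod u+x` on both the temp directory and the transferred module before invoking it with the remote Python. A local sketch of that permission step (file contents and paths here are illustrative, not the real AnsiballZ payload):

```python
import os
import stat
import tempfile

# Mimic: chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py
tmpdir = tempfile.mkdtemp()
module = os.path.join(tmpdir, "AnsiballZ_command.py")
with open(module, "w") as f:
    f.write("#!/usr/bin/python3\nprint('ok')\n")  # placeholder payload

for path in (tmpdir, module):
    # Add the owner-execute bit without disturbing the other mode bits.
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

assert os.stat(module).st_mode & stat.S_IXUSR
assert os.stat(tmpdir).st_mode & stat.S_IXUSR
```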
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.73918: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (a53fa9d7-87cd-4e9e-bb37-caaa5cc02140) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:36:52.704485", "end": "2024-09-20 21:36:52.736132", "delta": "0:00:00.031647", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882612.75079: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 30529 1726882612.75098: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882612.75158: stderr chunk (state=3): >>><<< 30529 1726882612.75170: stdout chunk (state=3): >>><<< 30529 1726882612.75203: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (a53fa9d7-87cd-4e9e-bb37-caaa5cc02140) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:36:52.704485", "end": "2024-09-20 21:36:52.736132", "delta": "0:00:00.031647", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 30529 1726882612.75265: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882612.75280: _low_level_execute_command(): starting 30529 1726882612.75301: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.4373312-31746-126071504341612/ > /dev/null 2>&1 && sleep 0' 30529 1726882612.76402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882612.76406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882612.76409: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882612.76411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882612.76413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882612.76415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882612.76501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882612.76631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882612.76910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882612.78898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882612.78902: stdout chunk (state=3): >>><<< 30529 1726882612.78904: stderr chunk (state=3): >>><<< 30529 1726882612.78907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882612.78908: handler run complete 30529 1726882612.78910: Evaluated conditional (False): False 30529 1726882612.78912: attempt loop complete, returning result 30529 1726882612.78913: _execute() done 30529 1726882612.78915: dumping result to json 30529 1726882612.78917: done dumping result, returning 30529 1726882612.78919: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-0000000009a0] 30529 1726882612.78920: sending task result for task 12673a56-9f93-b0f1-edc0-0000000009a0 30529 1726882612.79001: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000009a0 30529 1726882612.79006: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.031647", "end": "2024-09-20 21:36:52.736132", "rc": 1, "start": "2024-09-20 21:36:52.704485" } STDOUT: Connection 'statebr' (a53fa9d7-87cd-4e9e-bb37-caaa5cc02140) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30529 1726882612.79312: no more pending results, returning what we have 30529 1726882612.79316: results queue empty 30529 1726882612.79317: checking for any_errors_fatal 30529 1726882612.79318: done checking for any_errors_fatal 30529 1726882612.79319: checking for max_fail_percentage 30529 1726882612.79321: done checking for max_fail_percentage 30529 1726882612.79322: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.79323: done checking to see if all hosts have failed 30529 1726882612.79323: getting the remaining hosts for this loop 30529 1726882612.79325: done getting the remaining hosts for this loop 30529 1726882612.79329: getting the next task for host managed_node1 30529 1726882612.79339: done getting next task for host managed_node1 30529 1726882612.79344: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882612.79347: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
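The failed task above ran four newline-separated commands inside one `/bin/sh -c` invocation, so the reported `rc=1` is the exit status of the *last* command only; the first `nmcli con delete` actually succeeded (see STDOUT), while the later `nmcli con load` and `ip link del` failed because the profile file and device were already gone. The play continues anyway because the result is marked `...ignoring` (the cleanup task tolerates errors). The rc-of-last-command rule can be demonstrated directly:

```python
import subprocess

# With newline-separated commands, sh returns the status of the last one.
rc_last_fails = subprocess.run(["/bin/sh", "-c", "true\nfalse\n"]).returncode
rc_last_ok = subprocess.run(["/bin/sh", "-c", "false\ntrue\n"]).returncode

assert rc_last_fails != 0  # earlier success does not mask a failing tail
assert rc_last_ok == 0     # earlier failure is masked by a succeeding tail
```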
False 30529 1726882612.79351: getting variables 30529 1726882612.79353: in VariableManager get_vars() 30529 1726882612.79392: Calling all_inventory to load vars for managed_node1 30529 1726882612.79768: Calling groups_inventory to load vars for managed_node1 30529 1726882612.79777: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.79788: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.79792: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.79797: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.84006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.86530: done with get_vars() 30529 1726882612.86604: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 21:36:52 -0400 (0:00:00.472) 0:00:26.893 ****** 30529 1726882612.86707: entering _queue_task() for managed_node1/include_tasks 30529 1726882612.87271: worker is 1 (out of 1 available) 30529 1726882612.87283: exiting _queue_task() for managed_node1/include_tasks 30529 1726882612.87297: done queuing things up, now waiting for results queue to drain 30529 1726882612.87298: waiting for pending results... 
30529 1726882612.87633: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' 30529 1726882612.87853: in run() - task 12673a56-9f93-b0f1-edc0-000000000011 30529 1726882612.87866: variable 'ansible_search_path' from source: unknown 30529 1726882612.87933: calling self._execute() 30529 1726882612.88062: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.88066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.88078: variable 'omit' from source: magic vars 30529 1726882612.88774: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.88826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.88831: _execute() done 30529 1726882612.88835: dumping result to json 30529 1726882612.88837: done dumping result, returning 30529 1726882612.88847: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-000000000011] 30529 1726882612.88857: sending task result for task 12673a56-9f93-b0f1-edc0-000000000011 30529 1726882612.89062: no more pending results, returning what we have 30529 1726882612.89068: in VariableManager get_vars() 30529 1726882612.89107: Calling all_inventory to load vars for managed_node1 30529 1726882612.89110: Calling groups_inventory to load vars for managed_node1 30529 1726882612.89115: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.89131: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.89135: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.89138: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.89916: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000011 30529 1726882612.89920: WORKER PROCESS EXITING 30529 1726882612.91232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30529 1726882612.92141: done with get_vars() 30529 1726882612.92154: variable 'ansible_search_path' from source: unknown 30529 1726882612.92163: we have included files to process 30529 1726882612.92164: generating all_blocks data 30529 1726882612.92165: done generating all_blocks data 30529 1726882612.92169: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882612.92169: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882612.92171: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882612.92449: in VariableManager get_vars() 30529 1726882612.92467: done with get_vars() 30529 1726882612.92514: in VariableManager get_vars() 30529 1726882612.92526: done with get_vars() 30529 1726882612.92566: in VariableManager get_vars() 30529 1726882612.92585: done with get_vars() 30529 1726882612.92619: in VariableManager get_vars() 30529 1726882612.92629: done with get_vars() 30529 1726882612.92653: in VariableManager get_vars() 30529 1726882612.92662: done with get_vars() 30529 1726882612.92918: in VariableManager get_vars() 30529 1726882612.92928: done with get_vars() 30529 1726882612.92935: done processing included file 30529 1726882612.92936: iterating over new_blocks loaded from include file 30529 1726882612.92937: in VariableManager get_vars() 30529 1726882612.92943: done with get_vars() 30529 1726882612.92944: filtering new block on tags 30529 1726882612.93006: done filtering new block on tags 30529 1726882612.93009: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1 30529 1726882612.93014: extending task lists for all hosts with included 
blocks 30529 1726882612.93035: done extending task lists 30529 1726882612.93036: done processing included files 30529 1726882612.93037: results queue empty 30529 1726882612.93037: checking for any_errors_fatal 30529 1726882612.93040: done checking for any_errors_fatal 30529 1726882612.93040: checking for max_fail_percentage 30529 1726882612.93041: done checking for max_fail_percentage 30529 1726882612.93042: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.93042: done checking to see if all hosts have failed 30529 1726882612.93043: getting the remaining hosts for this loop 30529 1726882612.93044: done getting the remaining hosts for this loop 30529 1726882612.93045: getting the next task for host managed_node1 30529 1726882612.93048: done getting next task for host managed_node1 30529 1726882612.93049: ^ task is: TASK: TEST: {{ lsr_description }} 30529 1726882612.93050: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882612.93052: getting variables 30529 1726882612.93053: in VariableManager get_vars() 30529 1726882612.93058: Calling all_inventory to load vars for managed_node1 30529 1726882612.93059: Calling groups_inventory to load vars for managed_node1 30529 1726882612.93061: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.93064: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.93065: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.93067: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.93699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882612.95180: done with get_vars() 30529 1726882612.95214: done getting variables 30529 1726882612.95272: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882612.95422: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:36:52 -0400 (0:00:00.087) 0:00:26.980 ****** 30529 1726882612.95462: entering _queue_task() for managed_node1/debug 30529 1726882612.95862: worker is 1 (out of 1 available) 30529 1726882612.95879: exiting _queue_task() for managed_node1/debug 30529 1726882612.96095: done queuing things up, now waiting for results queue to drain 30529 1726882612.96098: waiting for pending results... 
30529 1726882612.96314: running TaskExecutor() for managed_node1/TASK: TEST: I can activate an existing profile 30529 1726882612.96361: in run() - task 12673a56-9f93-b0f1-edc0-000000000a49 30529 1726882612.96385: variable 'ansible_search_path' from source: unknown 30529 1726882612.96396: variable 'ansible_search_path' from source: unknown 30529 1726882612.96443: calling self._execute() 30529 1726882612.96532: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.96551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.96577: variable 'omit' from source: magic vars 30529 1726882612.97005: variable 'ansible_distribution_major_version' from source: facts 30529 1726882612.97086: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882612.97089: variable 'omit' from source: magic vars 30529 1726882612.97091: variable 'omit' from source: magic vars 30529 1726882612.97167: variable 'lsr_description' from source: include params 30529 1726882612.97191: variable 'omit' from source: magic vars 30529 1726882612.97241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882612.97280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882612.97600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882612.97603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.97605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882612.97607: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882612.97609: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.97610: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.97688: Set connection var ansible_shell_executable to /bin/sh 30529 1726882612.97700: Set connection var ansible_pipelining to False 30529 1726882612.97710: Set connection var ansible_shell_type to sh 30529 1726882612.97724: Set connection var ansible_timeout to 10 30529 1726882612.97903: Set connection var ansible_connection to ssh 30529 1726882612.97907: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882612.97909: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.97912: variable 'ansible_connection' from source: unknown 30529 1726882612.97914: variable 'ansible_module_compression' from source: unknown 30529 1726882612.97916: variable 'ansible_shell_type' from source: unknown 30529 1726882612.97918: variable 'ansible_shell_executable' from source: unknown 30529 1726882612.97924: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882612.97926: variable 'ansible_pipelining' from source: unknown 30529 1726882612.97928: variable 'ansible_timeout' from source: unknown 30529 1726882612.97929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882612.98120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882612.98139: variable 'omit' from source: magic vars 30529 1726882612.98145: starting attempt loop 30529 1726882612.98148: running the handler 30529 1726882612.98207: handler run complete 30529 1726882612.98211: attempt loop complete, returning result 30529 1726882612.98213: _execute() done 30529 1726882612.98216: dumping result to json 30529 1726882612.98220: done dumping result, returning 30529 
1726882612.98227: done running TaskExecutor() for managed_node1/TASK: TEST: I can activate an existing profile [12673a56-9f93-b0f1-edc0-000000000a49] 30529 1726882612.98243: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a49 30529 1726882612.98342: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a49 30529 1726882612.98345: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ########## I can activate an existing profile ########## 30529 1726882612.98404: no more pending results, returning what we have 30529 1726882612.98407: results queue empty 30529 1726882612.98408: checking for any_errors_fatal 30529 1726882612.98409: done checking for any_errors_fatal 30529 1726882612.98410: checking for max_fail_percentage 30529 1726882612.98412: done checking for max_fail_percentage 30529 1726882612.98413: checking to see if all hosts have failed and the running result is not ok 30529 1726882612.98413: done checking to see if all hosts have failed 30529 1726882612.98414: getting the remaining hosts for this loop 30529 1726882612.98416: done getting the remaining hosts for this loop 30529 1726882612.98420: getting the next task for host managed_node1 30529 1726882612.98427: done getting next task for host managed_node1 30529 1726882612.98430: ^ task is: TASK: Show item 30529 1726882612.98432: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882612.98436: getting variables 30529 1726882612.98438: in VariableManager get_vars() 30529 1726882612.98471: Calling all_inventory to load vars for managed_node1 30529 1726882612.98474: Calling groups_inventory to load vars for managed_node1 30529 1726882612.98477: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882612.98488: Calling all_plugins_play to load vars for managed_node1 30529 1726882612.98491: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882612.98496: Calling groups_plugins_play to load vars for managed_node1 30529 1726882612.99742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.00608: done with get_vars() 30529 1726882613.00624: done getting variables 30529 1726882613.00665: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:36:53 -0400 (0:00:00.052) 0:00:27.032 ****** 30529 1726882613.00686: entering _queue_task() for managed_node1/debug 30529 1726882613.01031: worker is 1 (out of 1 available) 30529 1726882613.01041: exiting _queue_task() for managed_node1/debug 30529 1726882613.01053: done queuing things up, now waiting for results queue to drain 30529 1726882613.01055: waiting for pending results... 
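Editor's note: the "Show item" task at run_test.yml:9 loops a debug over the `lsr_*` parameter names and prints each one per item; for `lsr_assert_when`, which the include did not define, the output below renders "VARIABLE IS NOT DEFINED!". A minimal sketch under those assumptions, using only the loop items attested in the trace:

```yaml
# Hypothetical sketch of run_test.yml:9, inferred from the per-item debug
# output in the trace; the exact task body and item list are assumed.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
```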
30529 1726882613.01809: running TaskExecutor() for managed_node1/TASK: Show item 30529 1726882613.01815: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4a 30529 1726882613.01819: variable 'ansible_search_path' from source: unknown 30529 1726882613.01821: variable 'ansible_search_path' from source: unknown 30529 1726882613.01823: variable 'omit' from source: magic vars 30529 1726882613.01953: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.01966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.01981: variable 'omit' from source: magic vars 30529 1726882613.02314: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.02331: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.02340: variable 'omit' from source: magic vars 30529 1726882613.02376: variable 'omit' from source: magic vars 30529 1726882613.02430: variable 'item' from source: unknown 30529 1726882613.02561: variable 'item' from source: unknown 30529 1726882613.02582: variable 'omit' from source: magic vars 30529 1726882613.02638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882613.02798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.02801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882613.02804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.02806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.02808: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.02810: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882613.02812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.02906: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.02916: Set connection var ansible_pipelining to False 30529 1726882613.02919: Set connection var ansible_shell_type to sh 30529 1726882613.02937: Set connection var ansible_timeout to 10 30529 1726882613.02940: Set connection var ansible_connection to ssh 30529 1726882613.02944: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.02964: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.02967: variable 'ansible_connection' from source: unknown 30529 1726882613.02969: variable 'ansible_module_compression' from source: unknown 30529 1726882613.02971: variable 'ansible_shell_type' from source: unknown 30529 1726882613.02974: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.02976: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.02982: variable 'ansible_pipelining' from source: unknown 30529 1726882613.02984: variable 'ansible_timeout' from source: unknown 30529 1726882613.02987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.03153: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.03168: variable 'omit' from source: magic vars 30529 1726882613.03172: starting attempt loop 30529 1726882613.03175: running the handler 30529 1726882613.03224: variable 'lsr_description' from source: include params 30529 1726882613.03389: variable 'lsr_description' from source: include params 30529 1726882613.03392: handler run complete 30529 1726882613.03397: attempt loop 
complete, returning result 30529 1726882613.03399: variable 'item' from source: unknown 30529 1726882613.03414: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 30529 1726882613.03563: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.03567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.03570: variable 'omit' from source: magic vars 30529 1726882613.03900: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.03903: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.03906: variable 'omit' from source: magic vars 30529 1726882613.03909: variable 'omit' from source: magic vars 30529 1726882613.03912: variable 'item' from source: unknown 30529 1726882613.03915: variable 'item' from source: unknown 30529 1726882613.03918: variable 'omit' from source: magic vars 30529 1726882613.03921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.03923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.03927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.03929: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.03932: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.03934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.04010: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.04020: Set connection var ansible_pipelining to False 
30529 1726882613.04023: Set connection var ansible_shell_type to sh 30529 1726882613.04033: Set connection var ansible_timeout to 10 30529 1726882613.04036: Set connection var ansible_connection to ssh 30529 1726882613.04041: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.04064: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.04067: variable 'ansible_connection' from source: unknown 30529 1726882613.04070: variable 'ansible_module_compression' from source: unknown 30529 1726882613.04072: variable 'ansible_shell_type' from source: unknown 30529 1726882613.04075: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.04077: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.04080: variable 'ansible_pipelining' from source: unknown 30529 1726882613.04082: variable 'ansible_timeout' from source: unknown 30529 1726882613.04086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.04188: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.04202: variable 'omit' from source: magic vars 30529 1726882613.04205: starting attempt loop 30529 1726882613.04208: running the handler 30529 1726882613.04239: variable 'lsr_setup' from source: include params 30529 1726882613.04313: variable 'lsr_setup' from source: include params 30529 1726882613.04406: handler run complete 30529 1726882613.04409: attempt loop complete, returning result 30529 1726882613.04411: variable 'item' from source: unknown 30529 1726882613.04464: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 30529 1726882613.04679: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.04682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.04684: variable 'omit' from source: magic vars 30529 1726882613.04729: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.04735: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.04737: variable 'omit' from source: magic vars 30529 1726882613.04753: variable 'omit' from source: magic vars 30529 1726882613.04804: variable 'item' from source: unknown 30529 1726882613.04863: variable 'item' from source: unknown 30529 1726882613.04886: variable 'omit' from source: magic vars 30529 1726882613.04909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.04917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.04922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.04933: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.04936: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.04938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.05037: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.05040: Set connection var ansible_pipelining to False 30529 1726882613.05043: Set connection var ansible_shell_type to sh 30529 1726882613.05045: Set connection var ansible_timeout to 10 30529 1726882613.05047: Set connection var ansible_connection to ssh 30529 1726882613.05049: Set connection var ansible_module_compression to 
ZIP_DEFLATED 30529 1726882613.05104: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.05107: variable 'ansible_connection' from source: unknown 30529 1726882613.05109: variable 'ansible_module_compression' from source: unknown 30529 1726882613.05111: variable 'ansible_shell_type' from source: unknown 30529 1726882613.05113: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.05115: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.05117: variable 'ansible_pipelining' from source: unknown 30529 1726882613.05119: variable 'ansible_timeout' from source: unknown 30529 1726882613.05121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.05213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.05217: variable 'omit' from source: magic vars 30529 1726882613.05219: starting attempt loop 30529 1726882613.05221: running the handler 30529 1726882613.05223: variable 'lsr_test' from source: include params 30529 1726882613.05307: variable 'lsr_test' from source: include params 30529 1726882613.05310: handler run complete 30529 1726882613.05313: attempt loop complete, returning result 30529 1726882613.05333: variable 'item' from source: unknown 30529 1726882613.05390: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 30529 1726882613.05474: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.05478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.05598: variable 'omit' from source: magic vars 
30529 1726882613.05659: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.05663: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.05675: variable 'omit' from source: magic vars 30529 1726882613.05682: variable 'omit' from source: magic vars 30529 1726882613.05767: variable 'item' from source: unknown 30529 1726882613.05791: variable 'item' from source: unknown 30529 1726882613.05810: variable 'omit' from source: magic vars 30529 1726882613.05833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.05839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.05898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.05901: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.05904: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.05906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.05944: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.05947: Set connection var ansible_pipelining to False 30529 1726882613.05949: Set connection var ansible_shell_type to sh 30529 1726882613.05960: Set connection var ansible_timeout to 10 30529 1726882613.05962: Set connection var ansible_connection to ssh 30529 1726882613.05975: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.05991: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.05998: variable 'ansible_connection' from source: unknown 30529 1726882613.06001: variable 'ansible_module_compression' from source: unknown 30529 1726882613.06006: variable 
'ansible_shell_type' from source: unknown 30529 1726882613.06008: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.06010: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.06012: variable 'ansible_pipelining' from source: unknown 30529 1726882613.06014: variable 'ansible_timeout' from source: unknown 30529 1726882613.06068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.06112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.06119: variable 'omit' from source: magic vars 30529 1726882613.06121: starting attempt loop 30529 1726882613.06124: running the handler 30529 1726882613.06145: variable 'lsr_assert' from source: include params 30529 1726882613.06213: variable 'lsr_assert' from source: include params 30529 1726882613.06286: handler run complete 30529 1726882613.06289: attempt loop complete, returning result 30529 1726882613.06291: variable 'item' from source: unknown 30529 1726882613.06317: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 30529 1726882613.06540: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.06546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.06550: variable 'omit' from source: magic vars 30529 1726882613.06618: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.06625: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.06631: variable 'omit' from 
source: magic vars 30529 1726882613.06648: variable 'omit' from source: magic vars 30529 1726882613.06698: variable 'item' from source: unknown 30529 1726882613.06743: variable 'item' from source: unknown 30529 1726882613.06753: variable 'omit' from source: magic vars 30529 1726882613.06772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.06775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.06780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.06788: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.06796: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.06798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.06843: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.06846: Set connection var ansible_pipelining to False 30529 1726882613.06848: Set connection var ansible_shell_type to sh 30529 1726882613.06856: Set connection var ansible_timeout to 10 30529 1726882613.06858: Set connection var ansible_connection to ssh 30529 1726882613.06863: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.06879: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.06882: variable 'ansible_connection' from source: unknown 30529 1726882613.06884: variable 'ansible_module_compression' from source: unknown 30529 1726882613.06887: variable 'ansible_shell_type' from source: unknown 30529 1726882613.06891: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.06896: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.06898: 
variable 'ansible_pipelining' from source: unknown 30529 1726882613.06900: variable 'ansible_timeout' from source: unknown 30529 1726882613.06902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.06959: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.06965: variable 'omit' from source: magic vars 30529 1726882613.06968: starting attempt loop 30529 1726882613.06970: running the handler 30529 1726882613.07049: handler run complete 30529 1726882613.07058: attempt loop complete, returning result 30529 1726882613.07068: variable 'item' from source: unknown 30529 1726882613.07114: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30529 1726882613.07180: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.07183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.07196: variable 'omit' from source: magic vars 30529 1726882613.07285: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.07291: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.07302: variable 'omit' from source: magic vars 30529 1726882613.07312: variable 'omit' from source: magic vars 30529 1726882613.07336: variable 'item' from source: unknown 30529 1726882613.07377: variable 'item' from source: unknown 30529 1726882613.07387: variable 'omit' from source: magic vars 30529 1726882613.07409: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.07412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.07416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.07423: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.07425: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.07430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.07471: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.07474: Set connection var ansible_pipelining to False 30529 1726882613.07477: Set connection var ansible_shell_type to sh 30529 1726882613.07484: Set connection var ansible_timeout to 10 30529 1726882613.07487: Set connection var ansible_connection to ssh 30529 1726882613.07495: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.07507: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.07516: variable 'ansible_connection' from source: unknown 30529 1726882613.07519: variable 'ansible_module_compression' from source: unknown 30529 1726882613.07521: variable 'ansible_shell_type' from source: unknown 30529 1726882613.07523: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.07527: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.07529: variable 'ansible_pipelining' from source: unknown 30529 1726882613.07531: variable 'ansible_timeout' from source: unknown 30529 1726882613.07533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.07583: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.07592: variable 'omit' from source: magic vars 30529 1726882613.07596: starting attempt loop 30529 1726882613.07599: running the handler 30529 1726882613.07610: variable 'lsr_fail_debug' from source: play vars 30529 1726882613.07656: variable 'lsr_fail_debug' from source: play vars 30529 1726882613.07668: handler run complete 30529 1726882613.07678: attempt loop complete, returning result 30529 1726882613.07688: variable 'item' from source: unknown 30529 1726882613.07736: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30529 1726882613.07813: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.07816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.07818: variable 'omit' from source: magic vars 30529 1726882613.07909: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.07912: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.07915: variable 'omit' from source: magic vars 30529 1726882613.07926: variable 'omit' from source: magic vars 30529 1726882613.07952: variable 'item' from source: unknown 30529 1726882613.07997: variable 'item' from source: unknown 30529 1726882613.08006: variable 'omit' from source: magic vars 30529 1726882613.08021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.08031: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.08034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.08042: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.08045: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.08047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.08095: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.08098: Set connection var ansible_pipelining to False 30529 1726882613.08101: Set connection var ansible_shell_type to sh 30529 1726882613.08107: Set connection var ansible_timeout to 10 30529 1726882613.08109: Set connection var ansible_connection to ssh 30529 1726882613.08114: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.08128: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.08131: variable 'ansible_connection' from source: unknown 30529 1726882613.08133: variable 'ansible_module_compression' from source: unknown 30529 1726882613.08140: variable 'ansible_shell_type' from source: unknown 30529 1726882613.08143: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.08145: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.08147: variable 'ansible_pipelining' from source: unknown 30529 1726882613.08149: variable 'ansible_timeout' from source: unknown 30529 1726882613.08151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.08209: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.08215: variable 'omit' from source: magic vars 30529 1726882613.08218: starting attempt loop 30529 1726882613.08220: running the handler 30529 1726882613.08235: variable 'lsr_cleanup' from source: include params 30529 1726882613.08279: variable 'lsr_cleanup' from source: include params 30529 1726882613.08294: handler run complete 30529 1726882613.08303: attempt loop complete, returning result 30529 1726882613.08313: variable 'item' from source: unknown 30529 1726882613.08356: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30529 1726882613.08431: dumping result to json 30529 1726882613.08435: done dumping result, returning 30529 1726882613.08437: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-000000000a4a] 30529 1726882613.08440: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4a 30529 1726882613.08481: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4a 30529 1726882613.08484: WORKER PROCESS EXITING 30529 1726882613.08535: no more pending results, returning what we have 30529 1726882613.08538: results queue empty 30529 1726882613.08539: checking for any_errors_fatal 30529 1726882613.08546: done checking for any_errors_fatal 30529 1726882613.08547: checking for max_fail_percentage 30529 1726882613.08549: done checking for max_fail_percentage 30529 1726882613.08550: checking to see if all hosts have failed and the running result is not ok 30529 1726882613.08551: done checking to see if all hosts have failed 30529 1726882613.08551: getting the remaining hosts for this loop 30529 1726882613.08553: done getting the remaining hosts for this loop 30529 
1726882613.08556: getting the next task for host managed_node1 30529 1726882613.08563: done getting next task for host managed_node1 30529 1726882613.08565: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882613.08568: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882613.08572: getting variables 30529 1726882613.08573: in VariableManager get_vars() 30529 1726882613.08616: Calling all_inventory to load vars for managed_node1 30529 1726882613.08618: Calling groups_inventory to load vars for managed_node1 30529 1726882613.08622: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.08632: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.08635: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.08637: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.09874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.10860: done with get_vars() 30529 1726882613.10875: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:36:53 -0400 (0:00:00.102) 0:00:27.135 ****** 30529 1726882613.10941: entering _queue_task() for managed_node1/include_tasks 30529 
1726882613.11157: worker is 1 (out of 1 available) 30529 1726882613.11170: exiting _queue_task() for managed_node1/include_tasks 30529 1726882613.11183: done queuing things up, now waiting for results queue to drain 30529 1726882613.11185: waiting for pending results... 30529 1726882613.11358: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882613.11431: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4b 30529 1726882613.11442: variable 'ansible_search_path' from source: unknown 30529 1726882613.11446: variable 'ansible_search_path' from source: unknown 30529 1726882613.11472: calling self._execute() 30529 1726882613.11543: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.11547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.11557: variable 'omit' from source: magic vars 30529 1726882613.11826: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.11835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.11842: _execute() done 30529 1726882613.11847: dumping result to json 30529 1726882613.11849: done dumping result, returning 30529 1726882613.11852: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-000000000a4b] 30529 1726882613.11863: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4b 30529 1726882613.11938: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4b 30529 1726882613.11940: WORKER PROCESS EXITING 30529 1726882613.11995: no more pending results, returning what we have 30529 1726882613.12000: in VariableManager get_vars() 30529 1726882613.12033: Calling all_inventory to load vars for managed_node1 30529 1726882613.12036: Calling groups_inventory to load vars for managed_node1 30529 1726882613.12039: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882613.12048: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.12051: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.12053: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.12811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.14157: done with get_vars() 30529 1726882613.14180: variable 'ansible_search_path' from source: unknown 30529 1726882613.14181: variable 'ansible_search_path' from source: unknown 30529 1726882613.14222: we have included files to process 30529 1726882613.14224: generating all_blocks data 30529 1726882613.14226: done generating all_blocks data 30529 1726882613.14232: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882613.14234: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882613.14236: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882613.14341: in VariableManager get_vars() 30529 1726882613.14361: done with get_vars() 30529 1726882613.14472: done processing included file 30529 1726882613.14474: iterating over new_blocks loaded from include file 30529 1726882613.14475: in VariableManager get_vars() 30529 1726882613.14488: done with get_vars() 30529 1726882613.14490: filtering new block on tags 30529 1726882613.14525: done filtering new block on tags 30529 1726882613.14527: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 1726882613.14533: extending task lists for all hosts with included blocks 30529 1726882613.14975: 
done extending task lists 30529 1726882613.14977: done processing included files 30529 1726882613.14978: results queue empty 30529 1726882613.14978: checking for any_errors_fatal 30529 1726882613.14984: done checking for any_errors_fatal 30529 1726882613.14985: checking for max_fail_percentage 30529 1726882613.14986: done checking for max_fail_percentage 30529 1726882613.14986: checking to see if all hosts have failed and the running result is not ok 30529 1726882613.14987: done checking to see if all hosts have failed 30529 1726882613.14988: getting the remaining hosts for this loop 30529 1726882613.14989: done getting the remaining hosts for this loop 30529 1726882613.14992: getting the next task for host managed_node1 30529 1726882613.14998: done getting next task for host managed_node1 30529 1726882613.15000: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882613.15003: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882613.15006: getting variables 30529 1726882613.15007: in VariableManager get_vars() 30529 1726882613.15016: Calling all_inventory to load vars for managed_node1 30529 1726882613.15018: Calling groups_inventory to load vars for managed_node1 30529 1726882613.15021: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.15026: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.15029: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.15031: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.16238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.17757: done with get_vars() 30529 1726882613.17778: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.069) 0:00:27.204 ****** 30529 1726882613.17851: entering _queue_task() for managed_node1/include_tasks 30529 1726882613.18199: worker is 1 (out of 1 available) 30529 1726882613.18213: exiting _queue_task() for managed_node1/include_tasks 30529 1726882613.18226: done queuing things up, now waiting for results queue to drain 30529 1726882613.18227: waiting for pending results... 
30529 1726882613.18615: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882613.18628: in run() - task 12673a56-9f93-b0f1-edc0-000000000a72 30529 1726882613.18649: variable 'ansible_search_path' from source: unknown 30529 1726882613.18657: variable 'ansible_search_path' from source: unknown 30529 1726882613.18698: calling self._execute() 30529 1726882613.18792: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.18807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.18827: variable 'omit' from source: magic vars 30529 1726882613.19196: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.19214: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.19225: _execute() done 30529 1726882613.19233: dumping result to json 30529 1726882613.19239: done dumping result, returning 30529 1726882613.19250: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-000000000a72] 30529 1726882613.19262: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a72 30529 1726882613.19388: no more pending results, returning what we have 30529 1726882613.19396: in VariableManager get_vars() 30529 1726882613.19433: Calling all_inventory to load vars for managed_node1 30529 1726882613.19436: Calling groups_inventory to load vars for managed_node1 30529 1726882613.19440: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.19454: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.19458: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.19461: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.20206: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a72 30529 1726882613.20210: WORKER PROCESS EXITING 30529 
1726882613.21035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.22674: done with get_vars() 30529 1726882613.22695: variable 'ansible_search_path' from source: unknown 30529 1726882613.22696: variable 'ansible_search_path' from source: unknown 30529 1726882613.22732: we have included files to process 30529 1726882613.22733: generating all_blocks data 30529 1726882613.22735: done generating all_blocks data 30529 1726882613.22736: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882613.22737: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882613.22739: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882613.23003: done processing included file 30529 1726882613.23005: iterating over new_blocks loaded from include file 30529 1726882613.23007: in VariableManager get_vars() 30529 1726882613.23022: done with get_vars() 30529 1726882613.23024: filtering new block on tags 30529 1726882613.23061: done filtering new block on tags 30529 1726882613.23064: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882613.23070: extending task lists for all hosts with included blocks 30529 1726882613.23229: done extending task lists 30529 1726882613.23231: done processing included files 30529 1726882613.23232: results queue empty 30529 1726882613.23232: checking for any_errors_fatal 30529 1726882613.23236: done checking for any_errors_fatal 30529 1726882613.23237: checking for max_fail_percentage 30529 1726882613.23238: done 
checking for max_fail_percentage 30529 1726882613.23238: checking to see if all hosts have failed and the running result is not ok 30529 1726882613.23239: done checking to see if all hosts have failed 30529 1726882613.23240: getting the remaining hosts for this loop 30529 1726882613.23241: done getting the remaining hosts for this loop 30529 1726882613.23244: getting the next task for host managed_node1 30529 1726882613.23248: done getting next task for host managed_node1 30529 1726882613.23250: ^ task is: TASK: Gather current interface info 30529 1726882613.23254: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882613.23256: getting variables 30529 1726882613.23257: in VariableManager get_vars() 30529 1726882613.23266: Calling all_inventory to load vars for managed_node1 30529 1726882613.23268: Calling groups_inventory to load vars for managed_node1 30529 1726882613.23270: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.23276: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.23278: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.23281: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.24384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.25863: done with get_vars() 30529 1726882613.25883: done getting variables 30529 1726882613.25924: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.081) 0:00:27.285 ****** 30529 1726882613.25954: entering _queue_task() for managed_node1/command 30529 1726882613.26310: worker is 1 (out of 1 available) 30529 1726882613.26323: exiting _queue_task() for managed_node1/command 30529 1726882613.26336: done queuing things up, now waiting for results queue to drain 30529 1726882613.26338: waiting for pending results... 
30529 1726882613.26618: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882613.26753: in run() - task 12673a56-9f93-b0f1-edc0-000000000aad 30529 1726882613.26774: variable 'ansible_search_path' from source: unknown 30529 1726882613.26781: variable 'ansible_search_path' from source: unknown 30529 1726882613.26998: calling self._execute() 30529 1726882613.27002: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.27004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.27007: variable 'omit' from source: magic vars 30529 1726882613.27299: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.27305: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.27324: variable 'omit' from source: magic vars 30529 1726882613.27368: variable 'omit' from source: magic vars 30529 1726882613.27396: variable 'omit' from source: magic vars 30529 1726882613.27425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882613.27454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.27471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882613.27486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.27498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.27521: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.27524: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.27528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882613.27605: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.27608: Set connection var ansible_pipelining to False 30529 1726882613.27611: Set connection var ansible_shell_type to sh 30529 1726882613.27624: Set connection var ansible_timeout to 10 30529 1726882613.27627: Set connection var ansible_connection to ssh 30529 1726882613.27645: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.27654: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.27658: variable 'ansible_connection' from source: unknown 30529 1726882613.27661: variable 'ansible_module_compression' from source: unknown 30529 1726882613.27664: variable 'ansible_shell_type' from source: unknown 30529 1726882613.27666: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.27668: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.27670: variable 'ansible_pipelining' from source: unknown 30529 1726882613.27672: variable 'ansible_timeout' from source: unknown 30529 1726882613.27677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.27775: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.27786: variable 'omit' from source: magic vars 30529 1726882613.27795: starting attempt loop 30529 1726882613.27798: running the handler 30529 1726882613.27811: _low_level_execute_command(): starting 30529 1726882613.27818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882613.28326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882613.28332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.28335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882613.28337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.28389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882613.28396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882613.28402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.28453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.30170: stdout chunk (state=3): >>>/root <<< 30529 1726882613.30254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882613.30297: stderr chunk (state=3): >>><<< 30529 1726882613.30302: stdout chunk (state=3): >>><<< 30529 1726882613.30321: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882613.30330: _low_level_execute_command(): starting 30529 1726882613.30336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884 `" && echo ansible-tmp-1726882613.3031933-31792-236564112076884="` echo /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884 `" ) && sleep 0' 30529 1726882613.31052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.31063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882613.31065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882613.31078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.31155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.33020: stdout chunk (state=3): >>>ansible-tmp-1726882613.3031933-31792-236564112076884=/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884 <<< 30529 1726882613.33124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882613.33148: stderr chunk (state=3): >>><<< 30529 1726882613.33151: stdout chunk (state=3): >>><<< 30529 1726882613.33166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882613.3031933-31792-236564112076884=/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882613.33196: variable 'ansible_module_compression' from source: unknown 30529 1726882613.33238: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882613.33265: variable 'ansible_facts' from source: unknown 30529 1726882613.33325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py 30529 1726882613.33428: Sending initial data 30529 1726882613.33432: Sent initial data (156 bytes) 30529 1726882613.33878: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882613.33884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.33887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882613.33889: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882613.33891: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.33981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.34013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.35570: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30529 1726882613.35573: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882613.35610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882613.35654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8v2uypnf /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py <<< 30529 1726882613.35664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py" <<< 30529 1726882613.35698: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8v2uypnf" to remote "/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py" <<< 30529 1726882613.35702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py" <<< 30529 1726882613.36230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882613.36264: stderr chunk (state=3): >>><<< 30529 1726882613.36267: stdout chunk (state=3): >>><<< 30529 1726882613.36309: done transferring module to remote 30529 1726882613.36319: _low_level_execute_command(): starting 30529 1726882613.36327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/ /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py && sleep 0' 30529 1726882613.36729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882613.36733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882613.36749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.36799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882613.36811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.36850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.38600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882613.38631: stderr chunk (state=3): >>><<< 30529 1726882613.38634: stdout chunk (state=3): >>><<< 30529 1726882613.38642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882613.38645: _low_level_execute_command(): starting 30529 1726882613.38650: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/AnsiballZ_command.py && sleep 0' 30529 1726882613.39104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882613.39107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882613.39110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.39112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882613.39114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.39164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882613.39168: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.39219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.54419: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:53.540119", "end": "2024-09-20 21:36:53.543190", "delta": "0:00:00.003071", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882613.56199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882613.56203: stdout chunk (state=3): >>><<< 30529 1726882613.56205: stderr chunk (state=3): >>><<< 30529 1726882613.56210: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:53.540119", "end": "2024-09-20 21:36:53.543190", "delta": "0:00:00.003071", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882613.56257: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882613.56268: _low_level_execute_command(): starting 30529 1726882613.56274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882613.3031933-31792-236564112076884/ > /dev/null 2>&1 && sleep 0' 30529 1726882613.56998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882613.57078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882613.57126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882613.57140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882613.57214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882613.59098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882613.59101: stderr chunk (state=3): >>><<< 30529 1726882613.59104: stdout chunk (state=3): >>><<< 30529 1726882613.59106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882613.59109: handler run complete 30529 1726882613.59111: Evaluated conditional (False): False 30529 1726882613.59112: attempt loop complete, returning result 30529 1726882613.59114: _execute() done 30529 1726882613.59116: dumping result to json 30529 1726882613.59118: done dumping result, returning 30529 1726882613.59120: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-000000000aad] 30529 1726882613.59122: sending task result for task 12673a56-9f93-b0f1-edc0-000000000aad ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003071", "end": "2024-09-20 21:36:53.543190", "rc": 0, "start": "2024-09-20 21:36:53.540119" } STDOUT: bonding_masters eth0 lo 30529 1726882613.59287: no more pending results, returning what we have 30529 1726882613.59290: results queue empty 30529 1726882613.59292: checking for any_errors_fatal 30529 1726882613.59295: done checking for any_errors_fatal 30529 1726882613.59296: checking for max_fail_percentage 30529 1726882613.59298: done checking for max_fail_percentage 30529 1726882613.59299: checking to see if all hosts have failed and the running result is not ok 30529 1726882613.59300: done checking to see if all hosts have 
failed 30529 1726882613.59301: getting the remaining hosts for this loop 30529 1726882613.59496: done getting the remaining hosts for this loop 30529 1726882613.59501: getting the next task for host managed_node1 30529 1726882613.59509: done getting next task for host managed_node1 30529 1726882613.59511: ^ task is: TASK: Set current_interfaces 30529 1726882613.59515: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882613.59520: getting variables 30529 1726882613.59521: in VariableManager get_vars() 30529 1726882613.59552: Calling all_inventory to load vars for managed_node1 30529 1726882613.59555: Calling groups_inventory to load vars for managed_node1 30529 1726882613.59559: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.59570: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.59574: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.59577: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.60098: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000aad 30529 1726882613.60102: WORKER PROCESS EXITING 30529 1726882613.61248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.68637: done with get_vars() 30529 1726882613.68660: done getting variables 30529 1726882613.68710: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:53 -0400 (0:00:00.427) 0:00:27.713 ****** 30529 1726882613.68733: entering _queue_task() for managed_node1/set_fact 30529 1726882613.69049: worker is 1 (out of 1 available) 30529 1726882613.69061: exiting _queue_task() for managed_node1/set_fact 30529 1726882613.69072: done queuing things up, now waiting for results queue to drain 30529 1726882613.69073: waiting for pending results... 
30529 1726882613.69512: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882613.69517: in run() - task 12673a56-9f93-b0f1-edc0-000000000aae 30529 1726882613.69532: variable 'ansible_search_path' from source: unknown 30529 1726882613.69542: variable 'ansible_search_path' from source: unknown 30529 1726882613.69581: calling self._execute() 30529 1726882613.69685: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.69701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.69717: variable 'omit' from source: magic vars 30529 1726882613.70102: variable 'ansible_distribution_major_version' from source: facts 30529 1726882613.70120: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882613.70131: variable 'omit' from source: magic vars 30529 1726882613.70188: variable 'omit' from source: magic vars 30529 1726882613.70310: variable '_current_interfaces' from source: set_fact 30529 1726882613.70379: variable 'omit' from source: magic vars 30529 1726882613.70430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882613.70473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882613.70617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882613.70621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.70624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882613.70627: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882613.70629: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.70632: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.70703: Set connection var ansible_shell_executable to /bin/sh 30529 1726882613.70714: Set connection var ansible_pipelining to False 30529 1726882613.70724: Set connection var ansible_shell_type to sh 30529 1726882613.70737: Set connection var ansible_timeout to 10 30529 1726882613.70744: Set connection var ansible_connection to ssh 30529 1726882613.70757: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882613.70783: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.70797: variable 'ansible_connection' from source: unknown 30529 1726882613.70806: variable 'ansible_module_compression' from source: unknown 30529 1726882613.70813: variable 'ansible_shell_type' from source: unknown 30529 1726882613.70822: variable 'ansible_shell_executable' from source: unknown 30529 1726882613.70833: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882613.70841: variable 'ansible_pipelining' from source: unknown 30529 1726882613.70848: variable 'ansible_timeout' from source: unknown 30529 1726882613.70856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882613.70997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882613.71020: variable 'omit' from source: magic vars 30529 1726882613.71030: starting attempt loop 30529 1726882613.71037: running the handler 30529 1726882613.71159: handler run complete 30529 1726882613.71162: attempt loop complete, returning result 30529 1726882613.71164: _execute() done 30529 1726882613.71166: dumping result to json 30529 1726882613.71168: done dumping result, returning 30529 
1726882613.71171: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-000000000aae] 30529 1726882613.71172: sending task result for task 12673a56-9f93-b0f1-edc0-000000000aae 30529 1726882613.71245: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000aae 30529 1726882613.71248: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30529 1726882613.71322: no more pending results, returning what we have 30529 1726882613.71325: results queue empty 30529 1726882613.71326: checking for any_errors_fatal 30529 1726882613.71335: done checking for any_errors_fatal 30529 1726882613.71336: checking for max_fail_percentage 30529 1726882613.71338: done checking for max_fail_percentage 30529 1726882613.71339: checking to see if all hosts have failed and the running result is not ok 30529 1726882613.71340: done checking to see if all hosts have failed 30529 1726882613.71340: getting the remaining hosts for this loop 30529 1726882613.71342: done getting the remaining hosts for this loop 30529 1726882613.71347: getting the next task for host managed_node1 30529 1726882613.71357: done getting next task for host managed_node1 30529 1726882613.71359: ^ task is: TASK: Show current_interfaces 30529 1726882613.71363: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882613.71369: getting variables 30529 1726882613.71371: in VariableManager get_vars() 30529 1726882613.71408: Calling all_inventory to load vars for managed_node1 30529 1726882613.71411: Calling groups_inventory to load vars for managed_node1 30529 1726882613.71415: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882613.71427: Calling all_plugins_play to load vars for managed_node1 30529 1726882613.71431: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882613.71434: Calling groups_plugins_play to load vars for managed_node1 30529 1726882613.72961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882613.74527: done with get_vars() 30529 1726882613.74546: done getting variables 30529 1726882613.74606: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:53 -0400 (0:00:00.059) 0:00:27.772 ****** 30529 1726882613.74638: entering _queue_task() for managed_node1/debug 30529 1726882613.75111: worker is 1 (out of 1 available) 30529 1726882613.75122: exiting _queue_task() for managed_node1/debug 30529 1726882613.75131: done queuing things up, now waiting for results queue to drain 30529 1726882613.75133: waiting for pending results... 
30529 1726882613.75206: running TaskExecutor() for managed_node1/TASK: Show current_interfaces
30529 1726882613.75357: in run() - task 12673a56-9f93-b0f1-edc0-000000000a73
30529 1726882613.75360: variable 'ansible_search_path' from source: unknown
30529 1726882613.75363: variable 'ansible_search_path' from source: unknown
30529 1726882613.75467: calling self._execute()
30529 1726882613.75485: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882613.75501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882613.75516: variable 'omit' from source: magic vars
30529 1726882613.75883: variable 'ansible_distribution_major_version' from source: facts
30529 1726882613.75908: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882613.75920: variable 'omit' from source: magic vars
30529 1726882613.75968: variable 'omit' from source: magic vars
30529 1726882613.76075: variable 'current_interfaces' from source: set_fact
30529 1726882613.76110: variable 'omit' from source: magic vars
30529 1726882613.76154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882613.76197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882613.76228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882613.76299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882613.76302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882613.76304: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882613.76306: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882613.76308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882613.76420: Set connection var ansible_shell_executable to /bin/sh
30529 1726882613.76431: Set connection var ansible_pipelining to False
30529 1726882613.76439: Set connection var ansible_shell_type to sh
30529 1726882613.76459: Set connection var ansible_timeout to 10
30529 1726882613.76466: Set connection var ansible_connection to ssh
30529 1726882613.76475: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882613.76506: variable 'ansible_shell_executable' from source: unknown
30529 1726882613.76557: variable 'ansible_connection' from source: unknown
30529 1726882613.76560: variable 'ansible_module_compression' from source: unknown
30529 1726882613.76562: variable 'ansible_shell_type' from source: unknown
30529 1726882613.76564: variable 'ansible_shell_executable' from source: unknown
30529 1726882613.76566: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882613.76568: variable 'ansible_pipelining' from source: unknown
30529 1726882613.76570: variable 'ansible_timeout' from source: unknown
30529 1726882613.76572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882613.76699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882613.76717: variable 'omit' from source: magic vars
30529 1726882613.76727: starting attempt loop
30529 1726882613.76733: running the handler
30529 1726882613.76787: handler run complete
30529 1726882613.76884: attempt loop complete, returning result
30529 1726882613.76887: _execute() done
30529 1726882613.76894: dumping result to json
30529 1726882613.76897: done dumping result, returning
30529 1726882613.76900: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-000000000a73]
30529 1726882613.76902: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a73
30529 1726882613.76965: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a73
30529 1726882613.76968: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

30529 1726882613.77035: no more pending results, returning what we have
30529 1726882613.77039: results queue empty
30529 1726882613.77040: checking for any_errors_fatal
30529 1726882613.77045: done checking for any_errors_fatal
30529 1726882613.77046: checking for max_fail_percentage
30529 1726882613.77048: done checking for max_fail_percentage
30529 1726882613.77049: checking to see if all hosts have failed and the running result is not ok
30529 1726882613.77050: done checking to see if all hosts have failed
30529 1726882613.77051: getting the remaining hosts for this loop
30529 1726882613.77053: done getting the remaining hosts for this loop
30529 1726882613.77057: getting the next task for host managed_node1
30529 1726882613.77066: done getting next task for host managed_node1
30529 1726882613.77071: ^ task is: TASK: Setup
30529 1726882613.77074: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882613.77078: getting variables
30529 1726882613.77080: in VariableManager get_vars()
30529 1726882613.77114: Calling all_inventory to load vars for managed_node1
30529 1726882613.77116: Calling groups_inventory to load vars for managed_node1
30529 1726882613.77120: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882613.77132: Calling all_plugins_play to load vars for managed_node1
30529 1726882613.77135: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882613.77138: Calling groups_plugins_play to load vars for managed_node1
30529 1726882613.78776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882613.80305: done with get_vars()
30529 1726882613.80324: done getting variables

TASK [Setup] *******************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24
Friday 20 September 2024 21:36:53 -0400 (0:00:00.057) 0:00:27.830 ******
30529 1726882613.80408: entering _queue_task() for managed_node1/include_tasks
30529 1726882613.80885: worker is 1 (out of 1 available)
30529 1726882613.81001: exiting _queue_task() for managed_node1/include_tasks
30529 1726882613.81014: done queuing things up, now waiting for results queue to drain
30529 1726882613.81015: waiting for pending results...
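The `MSG: current_interfaces: [...]` output above is the classic shape of a `debug` task result. Given the fact name and the message text in the log, the task at show_interfaces.yml:5 plausibly looks like the following — a hedged reconstruction, not copied from the collection:

```yaml
# Sketch of the "Show current_interfaces" task; the msg template is
# inferred from the MSG line in the log, which renders as
#   current_interfaces: ['bonding_masters', 'eth0', 'lo']
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```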
30529 1726882613.81610: running TaskExecutor() for managed_node1/TASK: Setup
30529 1726882613.81615: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4c
30529 1726882613.81631: variable 'ansible_search_path' from source: unknown
30529 1726882613.81635: variable 'ansible_search_path' from source: unknown
30529 1726882613.81740: variable 'lsr_setup' from source: include params
30529 1726882613.82224: variable 'lsr_setup' from source: include params
30529 1726882613.82406: variable 'omit' from source: magic vars
30529 1726882613.82698: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882613.82702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882613.82705: variable 'omit' from source: magic vars
30529 1726882613.83334: variable 'ansible_distribution_major_version' from source: facts
30529 1726882613.83338: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882613.83342: variable 'item' from source: unknown
30529 1726882613.83385: variable 'item' from source: unknown
30529 1726882613.83599: variable 'item' from source: unknown
30529 1726882613.83603: variable 'item' from source: unknown
30529 1726882613.84006: dumping result to json
30529 1726882613.84009: done dumping result, returning
30529 1726882613.84011: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-000000000a4c]
30529 1726882613.84013: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4c
30529 1726882613.84050: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4c
30529 1726882613.84053: WORKER PROCESS EXITING
30529 1726882613.84130: no more pending results, returning what we have
30529 1726882613.84134: in VariableManager get_vars()
30529 1726882613.84169: Calling all_inventory to load vars for managed_node1
30529 1726882613.84171: Calling groups_inventory to load vars for managed_node1
30529 1726882613.84174: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882613.84187: Calling all_plugins_play to load vars for managed_node1
30529 1726882613.84192: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882613.84197: Calling groups_plugins_play to load vars for managed_node1
30529 1726882613.87528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882613.91153: done with get_vars()
30529 1726882613.91175: variable 'ansible_search_path' from source: unknown
30529 1726882613.91177: variable 'ansible_search_path' from source: unknown
30529 1726882613.91223: we have included files to process
30529 1726882613.91225: generating all_blocks data
30529 1726882613.91227: done generating all_blocks data
30529 1726882613.91232: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30529 1726882613.91233: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30529 1726882613.91236: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml
30529 1726882613.91683: done processing included file
30529 1726882613.91685: iterating over new_blocks loaded from include file
30529 1726882613.91687: in VariableManager get_vars()
30529 1726882613.91909: done with get_vars()
30529 1726882613.91911: filtering new block on tags
30529 1726882613.91946: done filtering new block on tags
30529 1726882613.91948: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node1 => (item=tasks/create_bridge_profile.yml)
30529 1726882613.91953: extending task lists for all hosts with included blocks
30529 1726882613.93209: done extending task lists
30529 1726882613.93211: done processing included files
30529 1726882613.93212: results queue empty
30529 1726882613.93212: checking for any_errors_fatal
30529 1726882613.93217: done checking for any_errors_fatal
30529 1726882613.93218: checking for max_fail_percentage
30529 1726882613.93219: done checking for max_fail_percentage
30529 1726882613.93220: checking to see if all hosts have failed and the running result is not ok
30529 1726882613.93221: done checking to see if all hosts have failed
30529 1726882613.93222: getting the remaining hosts for this loop
30529 1726882613.93223: done getting the remaining hosts for this loop
30529 1726882613.93226: getting the next task for host managed_node1
30529 1726882613.93230: done getting next task for host managed_node1
30529 1726882613.93232: ^ task is: TASK: Include network role
30529 1726882613.93234: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882613.93237: getting variables
30529 1726882613.93238: in VariableManager get_vars()
30529 1726882613.93247: Calling all_inventory to load vars for managed_node1
30529 1726882613.93250: Calling groups_inventory to load vars for managed_node1
30529 1726882613.93252: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882613.93257: Calling all_plugins_play to load vars for managed_node1
30529 1726882613.93260: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882613.93263: Calling groups_plugins_play to load vars for managed_node1
30529 1726882613.94650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882613.97283: done with get_vars()
30529 1726882613.97311: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3
Friday 20 September 2024 21:36:53 -0400 (0:00:00.169) 0:00:27.999 ******
30529 1726882613.97388: entering _queue_task() for managed_node1/include_role
30529 1726882613.97753: worker is 1 (out of 1 available)
30529 1726882613.97999: exiting _queue_task() for managed_node1/include_role
30529 1726882613.98010: done queuing things up, now waiting for results queue to drain
30529 1726882613.98012: waiting for pending results...
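The "Setup" task above resolves `lsr_setup` from include params and reports `included: ... create_bridge_profile.yml ... => (item=tasks/create_bridge_profile.yml)`, which is the signature of an `include_tasks` loop. A hedged sketch of what run_test.yml:24 could contain (the loop variable shape is inferred from the `item=` result, not copied from the repository):

```yaml
# Reconstruction of the "Setup" task: include each file listed in
# lsr_setup; in this run lsr_setup evidently contained
# ['tasks/create_bridge_profile.yml'].
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
```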
30529 1726882613.98144: running TaskExecutor() for managed_node1/TASK: Include network role
30529 1726882613.98217: in run() - task 12673a56-9f93-b0f1-edc0-000000000ad1
30529 1726882613.98242: variable 'ansible_search_path' from source: unknown
30529 1726882613.98249: variable 'ansible_search_path' from source: unknown
30529 1726882613.98288: calling self._execute()
30529 1726882613.98398: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882613.98458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882613.98463: variable 'omit' from source: magic vars
30529 1726882613.98829: variable 'ansible_distribution_major_version' from source: facts
30529 1726882613.98846: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882613.98858: _execute() done
30529 1726882613.98867: dumping result to json
30529 1726882613.98874: done dumping result, returning
30529 1726882613.98898: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000000ad1]
30529 1726882613.99001: sending task result for task 12673a56-9f93-b0f1-edc0-000000000ad1
30529 1726882613.99083: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000ad1
30529 1726882613.99086: WORKER PROCESS EXITING
30529 1726882613.99323: no more pending results, returning what we have
30529 1726882613.99328: in VariableManager get_vars()
30529 1726882613.99365: Calling all_inventory to load vars for managed_node1
30529 1726882613.99368: Calling groups_inventory to load vars for managed_node1
30529 1726882613.99372: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882613.99387: Calling all_plugins_play to load vars for managed_node1
30529 1726882613.99396: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882613.99399: Calling groups_plugins_play to load vars for managed_node1
30529 1726882614.00948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882614.04387: done with get_vars()
30529 1726882614.04412: variable 'ansible_search_path' from source: unknown
30529 1726882614.04414: variable 'ansible_search_path' from source: unknown
30529 1726882614.04613: variable 'omit' from source: magic vars
30529 1726882614.04656: variable 'omit' from source: magic vars
30529 1726882614.04671: variable 'omit' from source: magic vars
30529 1726882614.04674: we have included files to process
30529 1726882614.04675: generating all_blocks data
30529 1726882614.04677: done generating all_blocks data
30529 1726882614.04678: processing included file: fedora.linux_system_roles.network
30529 1726882614.04703: in VariableManager get_vars()
30529 1726882614.04718: done with get_vars()
30529 1726882614.04746: in VariableManager get_vars()
30529 1726882614.04761: done with get_vars()
30529 1726882614.04803: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30529 1726882614.04925: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30529 1726882614.05007: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30529 1726882614.05440: in VariableManager get_vars()
30529 1726882614.05460: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30529 1726882614.08845: iterating over new_blocks loaded from include file
30529 1726882614.08848: in VariableManager get_vars()
30529 1726882614.08869: done with get_vars()
30529 1726882614.08871: filtering new block on tags
30529 1726882614.09171: done filtering new block on tags
30529 1726882614.09175: in VariableManager get_vars()
30529 1726882614.09196: done with get_vars()
30529 1726882614.09198: filtering new block on tags
30529 1726882614.09214: done filtering new block on tags
30529 1726882614.09217: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node1
30529 1726882614.09223: extending task lists for all hosts with included blocks
30529 1726882614.09372: done extending task lists
30529 1726882614.09373: done processing included files
30529 1726882614.09374: results queue empty
30529 1726882614.09374: checking for any_errors_fatal
30529 1726882614.09378: done checking for any_errors_fatal
30529 1726882614.09378: checking for max_fail_percentage
30529 1726882614.09379: done checking for max_fail_percentage
30529 1726882614.09380: checking to see if all hosts have failed and the running result is not ok
30529 1726882614.09381: done checking to see if all hosts have failed
30529 1726882614.09382: getting the remaining hosts for this loop
30529 1726882614.09383: done getting the remaining hosts for this loop
30529 1726882614.09385: getting the next task for host managed_node1
30529 1726882614.09392: done getting next task for host managed_node1
30529 1726882614.09396: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30529 1726882614.09399: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882614.09410: getting variables
30529 1726882614.09411: in VariableManager get_vars()
30529 1726882614.09424: Calling all_inventory to load vars for managed_node1
30529 1726882614.09426: Calling groups_inventory to load vars for managed_node1
30529 1726882614.09428: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882614.09434: Calling all_plugins_play to load vars for managed_node1
30529 1726882614.09436: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882614.09439: Calling groups_plugins_play to load vars for managed_node1
30529 1726882614.11705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882614.13365: done with get_vars()
30529 1726882614.13391: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:36:54 -0400 (0:00:00.160) 0:00:28.160 ******
30529 1726882614.13477: entering _queue_task() for managed_node1/include_tasks
30529 1726882614.13860: worker is 1 (out of 1 available)
30529 1726882614.13872: exiting _queue_task() for managed_node1/include_tasks
30529 1726882614.14112: done queuing things up, now waiting for results queue to drain
30529 1726882614.14114: waiting for pending results...
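The "Include network role" trace above ("processing included file: fedora.linux_system_roles.network", followed by the role's defaults/meta/tasks being loaded) corresponds to an `include_role` task at create_bridge_profile.yml:3. A minimal hedged sketch; the role name comes straight from the log, while any role variables the real test passes are omitted here:

```yaml
# Sketch of the task at create_bridge_profile.yml:3. The actual test
# almost certainly also sets network_connections etc.; only the
# include itself is reconstructed from this log.
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network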
30529 1726882614.14312: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30529 1726882614.14364: in run() - task 12673a56-9f93-b0f1-edc0-000000000b33
30529 1726882614.14403: variable 'ansible_search_path' from source: unknown
30529 1726882614.14407: variable 'ansible_search_path' from source: unknown
30529 1726882614.14447: calling self._execute()
30529 1726882614.14557: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882614.14561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882614.14599: variable 'omit' from source: magic vars
30529 1726882614.14956: variable 'ansible_distribution_major_version' from source: facts
30529 1726882614.14974: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882614.14988: _execute() done
30529 1726882614.15004: dumping result to json
30529 1726882614.15107: done dumping result, returning
30529 1726882614.15111: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000000b33]
30529 1726882614.15114: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b33
30529 1726882614.15187: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b33
30529 1726882614.15196: WORKER PROCESS EXITING
30529 1726882614.15258: no more pending results, returning what we have
30529 1726882614.15264: in VariableManager get_vars()
30529 1726882614.15316: Calling all_inventory to load vars for managed_node1
30529 1726882614.15320: Calling groups_inventory to load vars for managed_node1
30529 1726882614.15322: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882614.15336: Calling all_plugins_play to load vars for managed_node1
30529 1726882614.15340: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882614.15344: Calling groups_plugins_play to load vars for managed_node1
30529 1726882614.16894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882614.18459: done with get_vars()
30529 1726882614.18477: variable 'ansible_search_path' from source: unknown
30529 1726882614.18478: variable 'ansible_search_path' from source: unknown
30529 1726882614.18637: we have included files to process
30529 1726882614.18639: generating all_blocks data
30529 1726882614.18641: done generating all_blocks data
30529 1726882614.18644: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882614.18645: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882614.18647: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882614.19233: done processing included file
30529 1726882614.19235: iterating over new_blocks loaded from include file
30529 1726882614.19237: in VariableManager get_vars()
30529 1726882614.19260: done with get_vars()
30529 1726882614.19261: filtering new block on tags
30529 1726882614.19295: done filtering new block on tags
30529 1726882614.19298: in VariableManager get_vars()
30529 1726882614.19319: done with get_vars()
30529 1726882614.19321: filtering new block on tags
30529 1726882614.19366: done filtering new block on tags
30529 1726882614.19369: in VariableManager get_vars()
30529 1726882614.19392: done with get_vars()
30529 1726882614.19396: filtering new block on tags
30529 1726882614.19436: done filtering new block on tags
30529 1726882614.19439: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
30529 1726882614.19444: extending task lists for all hosts with included blocks
30529 1726882614.21151: done extending task lists
30529 1726882614.21152: done processing included files
30529 1726882614.21153: results queue empty
30529 1726882614.21154: checking for any_errors_fatal
30529 1726882614.21157: done checking for any_errors_fatal
30529 1726882614.21157: checking for max_fail_percentage
30529 1726882614.21159: done checking for max_fail_percentage
30529 1726882614.21160: checking to see if all hosts have failed and the running result is not ok
30529 1726882614.21161: done checking to see if all hosts have failed
30529 1726882614.21161: getting the remaining hosts for this loop
30529 1726882614.21163: done getting the remaining hosts for this loop
30529 1726882614.21166: getting the next task for host managed_node1
30529 1726882614.21171: done getting next task for host managed_node1
30529 1726882614.21173: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882614.21177: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882614.21188: getting variables
30529 1726882614.21191: in VariableManager get_vars()
30529 1726882614.21206: Calling all_inventory to load vars for managed_node1
30529 1726882614.21208: Calling groups_inventory to load vars for managed_node1
30529 1726882614.21210: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882614.21215: Calling all_plugins_play to load vars for managed_node1
30529 1726882614.21219: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882614.21222: Calling groups_plugins_play to load vars for managed_node1
30529 1726882614.22518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882614.24833: done with get_vars()
30529 1726882614.24852: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:36:54 -0400 (0:00:00.114) 0:00:28.275 ******
30529 1726882614.24955: entering _queue_task() for managed_node1/setup
30529 1726882614.25309: worker is 1 (out of 1 available)
30529 1726882614.25322: exiting _queue_task() for managed_node1/setup
30529 1726882614.25336: done queuing things up, now waiting for results queue to drain
30529 1726882614.25338: waiting for pending results...
30529 1726882614.25714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882614.25799: in run() - task 12673a56-9f93-b0f1-edc0-000000000b90
30529 1726882614.25822: variable 'ansible_search_path' from source: unknown
30529 1726882614.25836: variable 'ansible_search_path' from source: unknown
30529 1726882614.25941: calling self._execute()
30529 1726882614.25968: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882614.25977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882614.25991: variable 'omit' from source: magic vars
30529 1726882614.26352: variable 'ansible_distribution_major_version' from source: facts
30529 1726882614.26368: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882614.26755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30529 1726882614.28966: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30529 1726882614.29048: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30529 1726882614.29097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30529 1726882614.29144: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30529 1726882614.29254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30529 1726882614.29268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882614.29308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882614.29340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882614.29394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882614.29416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882614.29477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882614.29511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882614.29541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882614.29591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882614.29615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882614.29783: variable '__network_required_facts' from source: role
'' defaults 30529 1726882614.29998: variable 'ansible_facts' from source: unknown 30529 1726882614.30575: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882614.30585: when evaluation is False, skipping this task 30529 1726882614.30598: _execute() done 30529 1726882614.30607: dumping result to json 30529 1726882614.30615: done dumping result, returning 30529 1726882614.30628: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000000b90] 30529 1726882614.30638: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b90 30529 1726882614.30755: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b90 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882614.30812: no more pending results, returning what we have 30529 1726882614.30817: results queue empty 30529 1726882614.30819: checking for any_errors_fatal 30529 1726882614.30821: done checking for any_errors_fatal 30529 1726882614.30821: checking for max_fail_percentage 30529 1726882614.30823: done checking for max_fail_percentage 30529 1726882614.30824: checking to see if all hosts have failed and the running result is not ok 30529 1726882614.30825: done checking to see if all hosts have failed 30529 1726882614.30826: getting the remaining hosts for this loop 30529 1726882614.30828: done getting the remaining hosts for this loop 30529 1726882614.30832: getting the next task for host managed_node1 30529 1726882614.30846: done getting next task for host managed_node1 30529 1726882614.30850: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882614.30857: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882614.30880: getting variables 30529 1726882614.30882: in VariableManager get_vars() 30529 1726882614.30929: Calling all_inventory to load vars for managed_node1 30529 1726882614.30932: Calling groups_inventory to load vars for managed_node1 30529 1726882614.30934: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882614.30944: Calling all_plugins_play to load vars for managed_node1 30529 1726882614.30948: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882614.30951: Calling groups_plugins_play to load vars for managed_node1 30529 1726882614.31706: WORKER PROCESS EXITING 30529 1726882614.32691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882614.34388: done with get_vars() 30529 1726882614.34415: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:54 -0400 (0:00:00.095) 0:00:28.371 ****** 30529 1726882614.34528: entering _queue_task() for managed_node1/stat 30529 1726882614.34868: worker is 1 (out of 1 available) 30529 1726882614.34999: exiting _queue_task() for managed_node1/stat 30529 1726882614.35011: done queuing things up, now waiting for results queue to drain 30529 1726882614.35013: waiting for pending results... 
30529 1726882614.35211: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882614.35383: in run() - task 12673a56-9f93-b0f1-edc0-000000000b92 30529 1726882614.35412: variable 'ansible_search_path' from source: unknown 30529 1726882614.35421: variable 'ansible_search_path' from source: unknown 30529 1726882614.35473: calling self._execute() 30529 1726882614.35579: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882614.35597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882614.35614: variable 'omit' from source: magic vars 30529 1726882614.35994: variable 'ansible_distribution_major_version' from source: facts 30529 1726882614.36017: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882614.36177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882614.36526: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882614.36530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882614.36562: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882614.36603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882614.36703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882614.36742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882614.36865: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882614.36868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882614.36907: variable '__network_is_ostree' from source: set_fact 30529 1726882614.36919: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882614.36927: when evaluation is False, skipping this task 30529 1726882614.36934: _execute() done 30529 1726882614.36941: dumping result to json 30529 1726882614.36949: done dumping result, returning 30529 1726882614.36960: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000000b92] 30529 1726882614.36980: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b92 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882614.37144: no more pending results, returning what we have 30529 1726882614.37148: results queue empty 30529 1726882614.37149: checking for any_errors_fatal 30529 1726882614.37157: done checking for any_errors_fatal 30529 1726882614.37158: checking for max_fail_percentage 30529 1726882614.37160: done checking for max_fail_percentage 30529 1726882614.37161: checking to see if all hosts have failed and the running result is not ok 30529 1726882614.37162: done checking to see if all hosts have failed 30529 1726882614.37163: getting the remaining hosts for this loop 30529 1726882614.37165: done getting the remaining hosts for this loop 30529 1726882614.37169: getting the next task for host managed_node1 30529 1726882614.37179: done getting next task for host managed_node1 30529 
1726882614.37183: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882614.37195: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882614.37219: getting variables 30529 1726882614.37222: in VariableManager get_vars() 30529 1726882614.37259: Calling all_inventory to load vars for managed_node1 30529 1726882614.37262: Calling groups_inventory to load vars for managed_node1 30529 1726882614.37265: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882614.37276: Calling all_plugins_play to load vars for managed_node1 30529 1726882614.37280: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882614.37284: Calling groups_plugins_play to load vars for managed_node1 30529 1726882614.38125: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b92 30529 1726882614.38129: WORKER PROCESS EXITING 30529 1726882614.39068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882614.40682: done with get_vars() 30529 1726882614.40709: done getting variables 30529 1726882614.40779: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:54 -0400 (0:00:00.062) 0:00:28.434 ****** 30529 1726882614.40821: entering _queue_task() for managed_node1/set_fact 30529 1726882614.41163: worker is 1 (out of 1 available) 30529 1726882614.41175: exiting _queue_task() for managed_node1/set_fact 30529 1726882614.41301: done queuing things up, now waiting for results queue to drain 30529 1726882614.41303: waiting for pending results... 
30529 1726882614.41485: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882614.41671: in run() - task 12673a56-9f93-b0f1-edc0-000000000b93 30529 1726882614.41699: variable 'ansible_search_path' from source: unknown 30529 1726882614.41709: variable 'ansible_search_path' from source: unknown 30529 1726882614.41758: calling self._execute() 30529 1726882614.41864: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882614.41875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882614.41895: variable 'omit' from source: magic vars 30529 1726882614.42300: variable 'ansible_distribution_major_version' from source: facts 30529 1726882614.42318: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882614.42501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882614.42771: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882614.42998: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882614.43001: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882614.43003: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882614.43005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882614.43019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882614.43049: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882614.43080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882614.43178: variable '__network_is_ostree' from source: set_fact 30529 1726882614.43192: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882614.43202: when evaluation is False, skipping this task 30529 1726882614.43210: _execute() done 30529 1726882614.43217: dumping result to json 30529 1726882614.43231: done dumping result, returning 30529 1726882614.43244: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000000b93] 30529 1726882614.43254: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b93 30529 1726882614.43470: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b93 30529 1726882614.43473: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882614.43527: no more pending results, returning what we have 30529 1726882614.43531: results queue empty 30529 1726882614.43533: checking for any_errors_fatal 30529 1726882614.43541: done checking for any_errors_fatal 30529 1726882614.43542: checking for max_fail_percentage 30529 1726882614.43544: done checking for max_fail_percentage 30529 1726882614.43545: checking to see if all hosts have failed and the running result is not ok 30529 1726882614.43550: done checking to see if all hosts have failed 30529 1726882614.43551: getting the remaining hosts for this loop 30529 1726882614.43553: done getting the remaining hosts for this loop 
30529 1726882614.43557: getting the next task for host managed_node1 30529 1726882614.43571: done getting next task for host managed_node1 30529 1726882614.43575: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882614.43582: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882614.43677: getting variables 30529 1726882614.43679: in VariableManager get_vars() 30529 1726882614.43719: Calling all_inventory to load vars for managed_node1 30529 1726882614.43721: Calling groups_inventory to load vars for managed_node1 30529 1726882614.43723: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882614.43734: Calling all_plugins_play to load vars for managed_node1 30529 1726882614.43737: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882614.43739: Calling groups_plugins_play to load vars for managed_node1 30529 1726882614.45965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882614.47770: done with get_vars() 30529 1726882614.47797: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:54 -0400 (0:00:00.071) 0:00:28.505 ****** 30529 1726882614.47968: entering _queue_task() for managed_node1/service_facts 30529 1726882614.48442: worker is 1 (out of 1 available) 30529 1726882614.48453: exiting _queue_task() for managed_node1/service_facts 30529 1726882614.48465: done queuing things up, now waiting for results queue to drain 30529 1726882614.48467: waiting for pending results... 
30529 1726882614.48676: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882614.48951: in run() - task 12673a56-9f93-b0f1-edc0-000000000b95 30529 1726882614.48956: variable 'ansible_search_path' from source: unknown 30529 1726882614.48959: variable 'ansible_search_path' from source: unknown 30529 1726882614.48962: calling self._execute() 30529 1726882614.49006: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882614.49017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882614.49031: variable 'omit' from source: magic vars 30529 1726882614.49426: variable 'ansible_distribution_major_version' from source: facts 30529 1726882614.49442: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882614.49452: variable 'omit' from source: magic vars 30529 1726882614.49543: variable 'omit' from source: magic vars 30529 1726882614.49579: variable 'omit' from source: magic vars 30529 1726882614.49633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882614.49672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882614.49701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882614.49732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882614.49750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882614.49782: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882614.49816: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882614.49820: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882614.50098: Set connection var ansible_shell_executable to /bin/sh 30529 1726882614.50101: Set connection var ansible_pipelining to False 30529 1726882614.50103: Set connection var ansible_shell_type to sh 30529 1726882614.50105: Set connection var ansible_timeout to 10 30529 1726882614.50107: Set connection var ansible_connection to ssh 30529 1726882614.50109: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882614.50111: variable 'ansible_shell_executable' from source: unknown 30529 1726882614.50113: variable 'ansible_connection' from source: unknown 30529 1726882614.50116: variable 'ansible_module_compression' from source: unknown 30529 1726882614.50126: variable 'ansible_shell_type' from source: unknown 30529 1726882614.50134: variable 'ansible_shell_executable' from source: unknown 30529 1726882614.50230: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882614.50233: variable 'ansible_pipelining' from source: unknown 30529 1726882614.50235: variable 'ansible_timeout' from source: unknown 30529 1726882614.50236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882614.50601: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882614.50618: variable 'omit' from source: magic vars 30529 1726882614.50685: starting attempt loop 30529 1726882614.50691: running the handler 30529 1726882614.50705: _low_level_execute_command(): starting 30529 1726882614.50717: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882614.52218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882614.52222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30529 1726882614.52224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882614.52227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882614.52230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882614.52232: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882614.52234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882614.52237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882614.52239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882614.52241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882614.52243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882614.52245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882614.52252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882614.52261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882614.52267: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882614.52278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882614.52398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882614.52402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882614.52404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882614.52652: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30529 1726882614.54517: stdout chunk (state=3): >>>/root <<< 30529 1726882614.54520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882614.54523: stdout chunk (state=3): >>><<< 30529 1726882614.54525: stderr chunk (state=3): >>><<< 30529 1726882614.54528: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882614.54531: _low_level_execute_command(): starting 30529 1726882614.54535: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088 `" && echo ansible-tmp-1726882614.5445008-31847-125541229965088="` echo /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088 `" ) && sleep 0' 30529 
1726882614.55698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882614.55708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882614.55725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882614.55739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882614.55901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882614.55905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882614.55955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882614.55977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882614.56042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882614.58154: stdout chunk (state=3): >>>ansible-tmp-1726882614.5445008-31847-125541229965088=/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088 <<< 30529 1726882614.58158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882614.58163: stdout chunk (state=3): >>><<< 30529 1726882614.58165: stderr chunk (state=3): >>><<< 
30529 1726882614.58185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882614.5445008-31847-125541229965088=/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882614.58273: variable 'ansible_module_compression' from source: unknown 30529 1726882614.58277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882614.58446: variable 'ansible_facts' from source: unknown 30529 1726882614.58604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py 30529 1726882614.58929: Sending initial data 30529 1726882614.58933: Sent initial data (162 bytes) 30529 1726882614.59788: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882614.59807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882614.60003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882614.60007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882614.60009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882614.60040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882614.61600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882614.61727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882614.61732: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpt9ng2bcv /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py <<< 30529 1726882614.61734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py" <<< 30529 1726882614.61761: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpt9ng2bcv" to remote "/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py" <<< 30529 1726882614.62986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882614.63055: stderr chunk (state=3): >>><<< 30529 1726882614.63058: stdout chunk (state=3): >>><<< 30529 1726882614.63078: done transferring module to remote 30529 1726882614.63089: _low_level_execute_command(): starting 30529 1726882614.63099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/ /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py && sleep 0' 30529 1726882614.64167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882614.64176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882614.64234: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882614.64241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882614.64244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882614.64246: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882614.64248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882614.64258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882614.64260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882614.64262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882614.64682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882614.64688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882614.64696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882614.66356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882614.66360: stdout chunk (state=3): >>><<< 30529 1726882614.66366: stderr chunk (state=3): >>><<< 30529 1726882614.66381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882614.66384: _low_level_execute_command(): starting 30529 1726882614.66516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/AnsiballZ_service_facts.py && sleep 0' 30529 1726882614.67283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882614.67304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882614.67307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882614.67335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882614.67348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882614.67357: stderr chunk (state=3): >>>debug2: match not found <<< 
30529 1726882614.67367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882614.67380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882614.67462: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882614.67501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882614.67504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882614.68105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.19212: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30529 1726882616.19286: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", 
"status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": 
"sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 30529 1726882616.19318: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": 
"systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": 
{"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882616.20761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882616.20785: stderr chunk (state=3): >>><<< 30529 1726882616.20788: stdout chunk (state=3): >>><<< 30529 1726882616.20818: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882616.21254: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882616.21264: _low_level_execute_command(): starting 30529 1726882616.21267: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882614.5445008-31847-125541229965088/ > /dev/null 2>&1 && sleep 0' 30529 1726882616.21686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882616.21692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.21697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882616.21699: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found <<< 30529 1726882616.21703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.21741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.21759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.21833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.23798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.23802: stdout chunk (state=3): >>><<< 30529 1726882616.23804: stderr chunk (state=3): >>><<< 30529 1726882616.23806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30529 1726882616.23809: handler run complete 30529 1726882616.23886: variable 'ansible_facts' from source: unknown 30529 1726882616.24044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882616.24561: variable 'ansible_facts' from source: unknown 30529 1726882616.24704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882616.24927: attempt loop complete, returning result 30529 1726882616.24940: _execute() done 30529 1726882616.24943: dumping result to json 30529 1726882616.25009: done dumping result, returning 30529 1726882616.25019: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000000b95] 30529 1726882616.25022: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b95 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882616.25974: no more pending results, returning what we have 30529 1726882616.25977: results queue empty 30529 1726882616.25978: checking for any_errors_fatal 30529 1726882616.25982: done checking for any_errors_fatal 30529 1726882616.25983: checking for max_fail_percentage 30529 1726882616.25985: done checking for max_fail_percentage 30529 1726882616.25986: checking to see if all hosts have failed and the running result is not ok 30529 1726882616.25987: done checking to see if all hosts have failed 30529 1726882616.25987: getting the remaining hosts for this loop 30529 1726882616.25991: done getting the remaining hosts for this loop 30529 1726882616.25997: getting the next task for host managed_node1 30529 1726882616.26006: done getting next task for host managed_node1 30529 1726882616.26010: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 30529 1726882616.26016: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882616.26029: getting variables 30529 1726882616.26031: in VariableManager get_vars() 30529 1726882616.26060: Calling all_inventory to load vars for managed_node1 30529 1726882616.26063: Calling groups_inventory to load vars for managed_node1 30529 1726882616.26065: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882616.26074: Calling all_plugins_play to load vars for managed_node1 30529 1726882616.26077: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882616.26081: Calling groups_plugins_play to load vars for managed_node1 30529 1726882616.26605: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b95 30529 1726882616.26614: WORKER PROCESS EXITING 30529 1726882616.27422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882616.29061: done with get_vars() 30529 1726882616.29083: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:56 -0400 (0:00:01.812) 0:00:30.317 ****** 30529 1726882616.29185: entering _queue_task() for managed_node1/package_facts 30529 1726882616.29524: worker is 1 (out of 1 available) 30529 1726882616.29537: exiting _queue_task() for managed_node1/package_facts 30529 1726882616.29549: done queuing things up, now waiting for results queue to drain 30529 1726882616.29551: waiting for pending results... 
30529 1726882616.30011: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882616.30017: in run() - task 12673a56-9f93-b0f1-edc0-000000000b96 30529 1726882616.30031: variable 'ansible_search_path' from source: unknown 30529 1726882616.30039: variable 'ansible_search_path' from source: unknown 30529 1726882616.30076: calling self._execute() 30529 1726882616.30186: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882616.30200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882616.30215: variable 'omit' from source: magic vars 30529 1726882616.30580: variable 'ansible_distribution_major_version' from source: facts 30529 1726882616.30600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882616.30613: variable 'omit' from source: magic vars 30529 1726882616.30705: variable 'omit' from source: magic vars 30529 1726882616.30743: variable 'omit' from source: magic vars 30529 1726882616.30790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882616.30833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882616.30859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882616.30882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882616.31000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882616.31004: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882616.31006: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882616.31009: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882616.31060: Set connection var ansible_shell_executable to /bin/sh 30529 1726882616.31071: Set connection var ansible_pipelining to False 30529 1726882616.31078: Set connection var ansible_shell_type to sh 30529 1726882616.31091: Set connection var ansible_timeout to 10 30529 1726882616.31100: Set connection var ansible_connection to ssh 30529 1726882616.31113: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882616.31139: variable 'ansible_shell_executable' from source: unknown 30529 1726882616.31146: variable 'ansible_connection' from source: unknown 30529 1726882616.31153: variable 'ansible_module_compression' from source: unknown 30529 1726882616.31159: variable 'ansible_shell_type' from source: unknown 30529 1726882616.31165: variable 'ansible_shell_executable' from source: unknown 30529 1726882616.31171: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882616.31177: variable 'ansible_pipelining' from source: unknown 30529 1726882616.31183: variable 'ansible_timeout' from source: unknown 30529 1726882616.31189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882616.31389: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882616.31409: variable 'omit' from source: magic vars 30529 1726882616.31418: starting attempt loop 30529 1726882616.31424: running the handler 30529 1726882616.31450: _low_level_execute_command(): starting 30529 1726882616.31498: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882616.32322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.32366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882616.32388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.32407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.32485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.34083: stdout chunk (state=3): >>>/root <<< 30529 1726882616.34172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.34214: stderr chunk (state=3): >>><<< 30529 1726882616.34217: stdout chunk (state=3): >>><<< 30529 1726882616.34235: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882616.34245: _low_level_execute_command(): starting 30529 1726882616.34252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722 `" && echo ansible-tmp-1726882616.3423424-31970-131796091436722="` echo /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722 `" ) && sleep 0' 30529 1726882616.34668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882616.34673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.34676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882616.34685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.34729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.34733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.34779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.36737: stdout chunk (state=3): >>>ansible-tmp-1726882616.3423424-31970-131796091436722=/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722 <<< 30529 1726882616.36810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.36835: stderr chunk (state=3): >>><<< 30529 1726882616.36934: stdout chunk (state=3): >>><<< 30529 1726882616.36937: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882616.3423424-31970-131796091436722=/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882616.36940: variable 'ansible_module_compression' from source: unknown 30529 1726882616.36978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882616.37076: variable 'ansible_facts' from source: unknown 30529 1726882616.37276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py 30529 1726882616.37484: Sending initial data 30529 1726882616.37487: Sent initial data (162 bytes) 30529 1726882616.37964: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882616.37972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882616.38017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882616.38025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882616.38134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.38139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.38173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.39698: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882616.39701: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882616.39704: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882616.39716: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882616.39853: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882616.39906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpp7pfgtvj /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py <<< 30529 1726882616.39910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py" <<< 30529 1726882616.39970: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpp7pfgtvj" to remote "/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py" <<< 30529 1726882616.42011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.42014: stdout chunk (state=3): >>><<< 30529 1726882616.42065: stderr chunk (state=3): >>><<< 30529 1726882616.42069: done transferring module to remote 30529 1726882616.42075: _low_level_execute_command(): starting 30529 1726882616.42078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/ /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py && sleep 0' 30529 1726882616.42945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882616.42953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882616.42959: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.42961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882616.42963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.43013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882616.43017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.43071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.44750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.44782: stderr chunk (state=3): >>><<< 30529 1726882616.44785: stdout chunk (state=3): >>><<< 30529 1726882616.44806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882616.44809: _low_level_execute_command(): starting 30529 1726882616.44812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/AnsiballZ_package_facts.py && sleep 0' 30529 1726882616.45506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882616.45509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882616.45512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.45515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882616.45517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882616.45519: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.45558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882616.45575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.45602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.45751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.89408: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882616.89472: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": 
"1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": 
"zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", 
"version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30529 1726882616.89538: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common",
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": 
"9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", 
"version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": 
"perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": 
"perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source":
"rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name":
"python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882616.91423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
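The module result above follows the `ansible.builtin.package_facts` schema: `ansible_facts.packages` maps each package name to a list of installed instances, each carrying `name`, `version`, `release`, `epoch`, `arch`, and `source` keys (a package can appear more than once, e.g. one entry per arch). A minimal sketch of how consuming code might query that structure; the `packages` dict and the `installed_versions` helper are illustrative only, with the two sample entries copied from the output above:

```python
# Sketch: querying a package_facts-style result.
# `packages` mirrors the ansible_facts.packages structure above;
# the sample entries are taken from the module output for illustration.
packages = {
    "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "kernel": [{"name": "kernel", "version": "6.11.0",
                "release": "0.rc6.23.el10", "epoch": None,
                "arch": "x86_64", "source": "rpm"}],
}

def installed_versions(packages, name):
    """Return '[epoch:]version-release' strings for every installed
    instance of `name`; an absent package yields an empty list."""
    out = []
    for pkg in packages.get(name, []):
        evr = f"{pkg['version']}-{pkg['release']}"
        if pkg["epoch"] is not None:
            evr = f"{pkg['epoch']}:{evr}"
        out.append(evr)
    return out

print(installed_versions(packages, "openssh"))  # ['9.8p1-3.el10']
print(installed_versions(packages, "missing"))  # []
```

In a playbook the same lookup is usually written as a Jinja2 expression, e.g. `ansible_facts.packages['openssh'][0].version`, guarded by a `when: "'openssh' in ansible_facts.packages"` test.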
<<< 30529 1726882616.91426: stdout chunk (state=3): >>><<< 30529 1726882616.91429: stderr chunk (state=3): >>><<< 30529 1726882616.91608: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882616.93581: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882616.93617: _low_level_execute_command(): starting 30529 1726882616.93628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882616.3423424-31970-131796091436722/ > /dev/null 2>&1 && sleep 0' 30529 1726882616.94225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882616.94247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882616.94262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882616.94281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882616.94364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882616.94392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882616.94420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882616.94434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882616.94517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882616.96377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882616.96381: stdout chunk (state=3): >>><<< 30529 1726882616.96387: stderr chunk (state=3): >>><<< 30529 1726882616.96408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882616.96499: handler run complete 30529 1726882616.97374: variable 'ansible_facts' from source: unknown 30529 1726882616.97845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882616.99750: variable 'ansible_facts' from source: unknown 30529 1726882617.00168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.00899: attempt loop complete, returning result 30529 1726882617.00917: _execute() done 30529 1726882617.00920: dumping result to json 30529 1726882617.01136: done dumping result, returning 30529 1726882617.01146: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000000b96] 30529 1726882617.01149: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b96 30529 1726882617.03484: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b96 30529 1726882617.03487: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882617.03573: no more pending results, returning what we have 30529 1726882617.03575: results queue empty 30529 1726882617.03576: checking for any_errors_fatal 30529 1726882617.03579: done checking for any_errors_fatal 30529 1726882617.03580: checking for max_fail_percentage 30529 1726882617.03581: done checking for max_fail_percentage 30529 1726882617.03581: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.03582: done checking to see if all hosts have failed 30529 1726882617.03582: getting the remaining hosts for this loop 30529 1726882617.03583: done getting the remaining hosts for this loop 30529 1726882617.03586: getting 
the next task for host managed_node1 30529 1726882617.03596: done getting next task for host managed_node1 30529 1726882617.03599: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882617.03602: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882617.03610: getting variables 30529 1726882617.03611: in VariableManager get_vars() 30529 1726882617.03644: Calling all_inventory to load vars for managed_node1 30529 1726882617.03647: Calling groups_inventory to load vars for managed_node1 30529 1726882617.03649: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.03658: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.03660: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.03662: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.04355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.05411: done with get_vars() 30529 1726882617.05437: done getting variables 30529 1726882617.05513: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:57 -0400 (0:00:00.763) 0:00:31.081 ****** 30529 1726882617.05548: entering _queue_task() for managed_node1/debug 30529 1726882617.06249: worker is 1 (out of 1 available) 30529 1726882617.06261: exiting _queue_task() for managed_node1/debug 30529 1726882617.06277: done queuing things up, now waiting for results queue to drain 30529 1726882617.06279: waiting for pending results... 
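The "Print network provider" task queued here (task path `roles/network/tasks/main.yml:7`) is a plain `debug` action; a minimal sketch consistent with the `Using network provider: nm` message it emits (the exact wording in the role may differ):

```yaml
# Sketch of the debug task at roles/network/tasks/main.yml:7;
# network_provider was set earlier via set_fact (per the trace).
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```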
30529 1726882617.06703: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882617.06750: in run() - task 12673a56-9f93-b0f1-edc0-000000000b34 30529 1726882617.06773: variable 'ansible_search_path' from source: unknown 30529 1726882617.06782: variable 'ansible_search_path' from source: unknown 30529 1726882617.06829: calling self._execute() 30529 1726882617.06940: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.06953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.06969: variable 'omit' from source: magic vars 30529 1726882617.07368: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.07384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.07400: variable 'omit' from source: magic vars 30529 1726882617.07464: variable 'omit' from source: magic vars 30529 1726882617.07597: variable 'network_provider' from source: set_fact 30529 1726882617.07624: variable 'omit' from source: magic vars 30529 1726882617.07700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882617.07720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882617.07749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882617.07774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882617.07848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882617.07851: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882617.07854: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882617.07856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.08023: Set connection var ansible_shell_executable to /bin/sh 30529 1726882617.08027: Set connection var ansible_pipelining to False 30529 1726882617.08029: Set connection var ansible_shell_type to sh 30529 1726882617.08031: Set connection var ansible_timeout to 10 30529 1726882617.08033: Set connection var ansible_connection to ssh 30529 1726882617.08035: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882617.08037: variable 'ansible_shell_executable' from source: unknown 30529 1726882617.08039: variable 'ansible_connection' from source: unknown 30529 1726882617.08041: variable 'ansible_module_compression' from source: unknown 30529 1726882617.08043: variable 'ansible_shell_type' from source: unknown 30529 1726882617.08045: variable 'ansible_shell_executable' from source: unknown 30529 1726882617.08503: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.08507: variable 'ansible_pipelining' from source: unknown 30529 1726882617.08510: variable 'ansible_timeout' from source: unknown 30529 1726882617.08513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.08516: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882617.08518: variable 'omit' from source: magic vars 30529 1726882617.08521: starting attempt loop 30529 1726882617.08523: running the handler 30529 1726882617.08735: handler run complete 30529 1726882617.08756: attempt loop complete, returning result 30529 1726882617.08764: _execute() done 30529 1726882617.08772: dumping result to json 30529 1726882617.08780: done dumping result, returning 
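The "Set connection var ..." lines above show per-host connection settings being resolved before the action runs. These come from inventory/host vars (or defaults when "from source: unknown"); a hypothetical `host_vars/managed_node1.yml` fragment illustrating the knobs being set here, assuming inventory-level configuration:

```yaml
# Hypothetical host_vars sketch: each key maps to one
# "Set connection var ..." line in the trace.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
```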
30529 1726882617.08796: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000000b34] 30529 1726882617.08836: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b34 ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882617.09008: no more pending results, returning what we have 30529 1726882617.09012: results queue empty 30529 1726882617.09013: checking for any_errors_fatal 30529 1726882617.09022: done checking for any_errors_fatal 30529 1726882617.09023: checking for max_fail_percentage 30529 1726882617.09025: done checking for max_fail_percentage 30529 1726882617.09026: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.09027: done checking to see if all hosts have failed 30529 1726882617.09028: getting the remaining hosts for this loop 30529 1726882617.09030: done getting the remaining hosts for this loop 30529 1726882617.09033: getting the next task for host managed_node1 30529 1726882617.09042: done getting next task for host managed_node1 30529 1726882617.09046: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882617.09053: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.09067: getting variables 30529 1726882617.09069: in VariableManager get_vars() 30529 1726882617.09513: Calling all_inventory to load vars for managed_node1 30529 1726882617.09516: Calling groups_inventory to load vars for managed_node1 30529 1726882617.09519: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.09531: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.09535: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.09538: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.10401: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b34 30529 1726882617.10404: WORKER PROCESS EXITING 30529 1726882617.12225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.14950: done with get_vars() 30529 1726882617.14975: done getting variables 30529 1726882617.15170: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:57 -0400 (0:00:00.097) 0:00:31.178 ****** 30529 1726882617.15254: entering _queue_task() for managed_node1/fail 30529 1726882617.16047: worker is 1 (out of 1 available) 30529 1726882617.16058: exiting _queue_task() for managed_node1/fail 30529 1726882617.16071: done queuing things up, now waiting for results queue to drain 30529 1726882617.16072: waiting for pending results... 30529 1726882617.16480: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882617.16999: in run() - task 12673a56-9f93-b0f1-edc0-000000000b35 30529 1726882617.17003: variable 'ansible_search_path' from source: unknown 30529 1726882617.17006: variable 'ansible_search_path' from source: unknown 30529 1726882617.17009: calling self._execute() 30529 1726882617.17011: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.17013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.17015: variable 'omit' from source: magic vars 30529 1726882617.17858: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.17988: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.18166: variable 'network_state' from source: role '' defaults 30529 1726882617.18185: Evaluated conditional (network_state != {}): False 30529 1726882617.18197: when evaluation is False, skipping this task 30529 1726882617.18206: _execute() done 30529 1726882617.18214: dumping result to json 30529 1726882617.18221: done dumping result, returning 30529 1726882617.18238: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000000b35] 30529 1726882617.18250: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b35 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882617.18466: no more pending results, returning what we have 30529 1726882617.18471: results queue empty 30529 1726882617.18472: checking for any_errors_fatal 30529 1726882617.18480: done checking for any_errors_fatal 30529 1726882617.18480: checking for max_fail_percentage 30529 1726882617.18482: done checking for max_fail_percentage 30529 1726882617.18483: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.18485: done checking to see if all hosts have failed 30529 1726882617.18485: getting the remaining hosts for this loop 30529 1726882617.18487: done getting the remaining hosts for this loop 30529 1726882617.18491: getting the next task for host managed_node1 30529 1726882617.18609: done getting next task for host managed_node1 30529 1726882617.18613: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882617.18619: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.18644: getting variables 30529 1726882617.18647: in VariableManager get_vars() 30529 1726882617.18682: Calling all_inventory to load vars for managed_node1 30529 1726882617.18685: Calling groups_inventory to load vars for managed_node1 30529 1726882617.18688: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.18723: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.18727: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.18827: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.19444: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b35 30529 1726882617.19447: WORKER PROCESS EXITING 30529 1726882617.20656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.34292: done with get_vars() 30529 1726882617.34525: done getting variables 30529 1726882617.34576: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:57 -0400 (0:00:00.193) 0:00:31.372 ****** 30529 1726882617.34612: entering _queue_task() for managed_node1/fail 30529 1726882617.35383: worker is 1 (out of 1 available) 30529 1726882617.35400: exiting _queue_task() for managed_node1/fail 30529 1726882617.35413: done queuing things up, now waiting for results queue to drain 30529 1726882617.35415: waiting for pending results... 30529 1726882617.36185: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882617.36600: in run() - task 12673a56-9f93-b0f1-edc0-000000000b36 30529 1726882617.36606: variable 'ansible_search_path' from source: unknown 30529 1726882617.36609: variable 'ansible_search_path' from source: unknown 30529 1726882617.36799: calling self._execute() 30529 1726882617.36889: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.36963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.36978: variable 'omit' from source: magic vars 30529 1726882617.37799: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.37804: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.38105: variable 'network_state' from source: role '' defaults 30529 1726882617.38368: Evaluated conditional (network_state != {}): False 30529 1726882617.38371: when evaluation is False, skipping this task 30529 1726882617.38374: _execute() done 30529 1726882617.38377: dumping result to json 30529 1726882617.38379: done dumping result, returning 30529 1726882617.38382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000000b36] 30529 1726882617.38385: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b36 30529 1726882617.38467: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b36 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882617.38846: no more pending results, returning what we have 30529 1726882617.38851: results queue empty 30529 1726882617.38852: checking for any_errors_fatal 30529 1726882617.38858: done checking for any_errors_fatal 30529 1726882617.38859: checking for max_fail_percentage 30529 1726882617.38860: done checking for max_fail_percentage 30529 1726882617.38861: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.38862: done checking to see if all hosts have failed 30529 1726882617.38863: getting the remaining hosts for this loop 30529 1726882617.38865: done getting the remaining hosts for this loop 30529 1726882617.38868: getting the next task for host managed_node1 30529 1726882617.38877: done getting next task for host managed_node1 30529 1726882617.38881: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882617.38886: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.38919: getting variables 30529 1726882617.38921: in VariableManager get_vars() 30529 1726882617.38957: Calling all_inventory to load vars for managed_node1 30529 1726882617.38959: Calling groups_inventory to load vars for managed_node1 30529 1726882617.38961: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.38973: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.38976: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.38979: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.39530: WORKER PROCESS EXITING 30529 1726882617.41795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.45012: done with get_vars() 30529 1726882617.45046: done getting variables 30529 1726882617.45322: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:57 -0400 (0:00:00.107) 0:00:31.479 ****** 30529 1726882617.45359: entering _queue_task() for managed_node1/fail 30529 1726882617.46127: worker is 1 (out of 1 available) 30529 1726882617.46142: exiting _queue_task() for managed_node1/fail 30529 1726882617.46155: done queuing things up, now waiting for results queue to drain 30529 1726882617.46157: waiting for pending results... 30529 1726882617.46292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882617.46436: in run() - task 12673a56-9f93-b0f1-edc0-000000000b37 30529 1726882617.46456: variable 'ansible_search_path' from source: unknown 30529 1726882617.46466: variable 'ansible_search_path' from source: unknown 30529 1726882617.46514: calling self._execute() 30529 1726882617.46619: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.46631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.46709: variable 'omit' from source: magic vars 30529 1726882617.47039: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.47053: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.47218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882617.49481: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882617.49568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882617.49614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882617.49664: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882617.49703: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882617.49779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.49860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.49863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.49887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.49912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.50010: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.50031: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882617.50151: variable 'ansible_distribution' from source: facts 30529 1726882617.50161: variable '__network_rh_distros' from source: role '' defaults 30529 1726882617.50499: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882617.50659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30529 1726882617.50810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.50846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.50896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.50955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.51099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.51103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.51161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.51206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.51284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.51350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.51453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.51557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.51623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.51645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.51982: variable 'network_connections' from source: include params 30529 1726882617.52005: variable 'interface' from source: play vars 30529 1726882617.52076: variable 'interface' from source: play vars 30529 1726882617.52100: variable 'network_state' from source: role '' defaults 30529 1726882617.52172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882617.52705: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882617.52746: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882617.52780: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882617.52825: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882617.52871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882617.52908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882617.53008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.53011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882617.53023: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882617.53031: when evaluation is False, skipping this task 30529 1726882617.53039: _execute() done 30529 1726882617.53047: dumping result to json 30529 1726882617.53054: done dumping result, returning 30529 1726882617.53067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000000b37] 30529 1726882617.53077: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b37 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | 
selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882617.53266: no more pending results, returning what we have 30529 1726882617.53271: results queue empty 30529 1726882617.53272: checking for any_errors_fatal 30529 1726882617.53277: done checking for any_errors_fatal 30529 1726882617.53278: checking for max_fail_percentage 30529 1726882617.53280: done checking for max_fail_percentage 30529 1726882617.53281: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.53282: done checking to see if all hosts have failed 30529 1726882617.53283: getting the remaining hosts for this loop 30529 1726882617.53284: done getting the remaining hosts for this loop 30529 1726882617.53292: getting the next task for host managed_node1 30529 1726882617.53303: done getting next task for host managed_node1 30529 1726882617.53307: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882617.53313: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.53335: getting variables 30529 1726882617.53338: in VariableManager get_vars() 30529 1726882617.53375: Calling all_inventory to load vars for managed_node1 30529 1726882617.53378: Calling groups_inventory to load vars for managed_node1 30529 1726882617.53381: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.53576: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.53581: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.53588: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b37 30529 1726882617.53595: WORKER PROCESS EXITING 30529 1726882617.53600: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.55251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.56800: done with get_vars() 30529 1726882617.56821: done getting variables 30529 1726882617.56878: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:57 -0400 (0:00:00.115) 0:00:31.595 ****** 30529 1726882617.56915: entering _queue_task() for managed_node1/dnf 30529 1726882617.57332: worker is 1 (out of 1 available) 30529 1726882617.57345: exiting _queue_task() for managed_node1/dnf 30529 1726882617.57356: done queuing things up, now waiting for results queue to drain 30529 1726882617.57358: waiting for pending results... 30529 1726882617.57560: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882617.57718: in run() - task 12673a56-9f93-b0f1-edc0-000000000b38 30529 1726882617.57739: variable 'ansible_search_path' from source: unknown 30529 1726882617.57747: variable 'ansible_search_path' from source: unknown 30529 1726882617.57783: calling self._execute() 30529 1726882617.57879: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.57895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.57915: variable 'omit' from source: magic vars 30529 1726882617.58295: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.58313: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.58526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882617.60737: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882617.60820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882617.60866: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882617.60952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882617.60956: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882617.61028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.61066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.61100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.61144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.61165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.61295: variable 'ansible_distribution' from source: facts 30529 1726882617.61395: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.61398: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882617.61443: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882617.61582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.61620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.61649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.61695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.61719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.61761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.61788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.61825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.61867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.61886: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.61937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.62041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.62045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.62047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.62054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.62218: variable 'network_connections' from source: include params 30529 1726882617.62234: variable 'interface' from source: play vars 30529 1726882617.62308: variable 'interface' from source: play vars 30529 1726882617.62396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882617.62579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882617.62633: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882617.62668: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882617.62708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882617.62755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882617.62781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882617.62898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.62902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882617.62922: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882617.63168: variable 'network_connections' from source: include params 30529 1726882617.63179: variable 'interface' from source: play vars 30529 1726882617.63249: variable 'interface' from source: play vars 30529 1726882617.63286: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882617.63299: when evaluation is False, skipping this task 30529 1726882617.63306: _execute() done 30529 1726882617.63313: dumping result to json 30529 1726882617.63320: done dumping result, returning 30529 1726882617.63331: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000b38] 30529 
1726882617.63340: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b38 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882617.63645: no more pending results, returning what we have 30529 1726882617.63650: results queue empty 30529 1726882617.63651: checking for any_errors_fatal 30529 1726882617.63659: done checking for any_errors_fatal 30529 1726882617.63660: checking for max_fail_percentage 30529 1726882617.63662: done checking for max_fail_percentage 30529 1726882617.63663: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.63664: done checking to see if all hosts have failed 30529 1726882617.63665: getting the remaining hosts for this loop 30529 1726882617.63667: done getting the remaining hosts for this loop 30529 1726882617.63671: getting the next task for host managed_node1 30529 1726882617.63680: done getting next task for host managed_node1 30529 1726882617.63684: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882617.63692: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.63717: getting variables 30529 1726882617.63719: in VariableManager get_vars() 30529 1726882617.63755: Calling all_inventory to load vars for managed_node1 30529 1726882617.63758: Calling groups_inventory to load vars for managed_node1 30529 1726882617.63760: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.63771: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.63774: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.63777: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.64328: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b38 30529 1726882617.64331: WORKER PROCESS EXITING 30529 1726882617.65281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.66885: done with get_vars() 30529 1726882617.66910: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882617.66982: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:57 -0400 (0:00:00.101) 0:00:31.696 ****** 30529 1726882617.67020: entering _queue_task() for managed_node1/yum 30529 1726882617.67428: worker is 1 (out of 1 available) 30529 1726882617.67439: exiting _queue_task() for managed_node1/yum 30529 1726882617.67449: done queuing things up, now waiting for results queue to drain 30529 1726882617.67450: waiting for pending results... 30529 1726882617.67647: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882617.67797: in run() - task 12673a56-9f93-b0f1-edc0-000000000b39 30529 1726882617.67818: variable 'ansible_search_path' from source: unknown 30529 1726882617.67825: variable 'ansible_search_path' from source: unknown 30529 1726882617.67862: calling self._execute() 30529 1726882617.67959: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.67970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.67985: variable 'omit' from source: magic vars 30529 1726882617.68376: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.68395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.68588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882617.71469: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882617.71551: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882617.71606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882617.71702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882617.71726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882617.71826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.71863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.71903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.71999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.72031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.72144: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.72178: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882617.72197: when evaluation is False, skipping this task 30529 1726882617.72213: _execute() done 30529 1726882617.72225: dumping result to json 30529 1726882617.72251: done dumping result, 
returning 30529 1726882617.72266: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000b39] 30529 1726882617.72278: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b39 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882617.72629: no more pending results, returning what we have 30529 1726882617.72633: results queue empty 30529 1726882617.72634: checking for any_errors_fatal 30529 1726882617.72641: done checking for any_errors_fatal 30529 1726882617.72642: checking for max_fail_percentage 30529 1726882617.72647: done checking for max_fail_percentage 30529 1726882617.72648: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.72651: done checking to see if all hosts have failed 30529 1726882617.72651: getting the remaining hosts for this loop 30529 1726882617.72654: done getting the remaining hosts for this loop 30529 1726882617.72659: getting the next task for host managed_node1 30529 1726882617.72671: done getting next task for host managed_node1 30529 1726882617.72677: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882617.72686: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.72737: getting variables 30529 1726882617.72745: in VariableManager get_vars() 30529 1726882617.72783: Calling all_inventory to load vars for managed_node1 30529 1726882617.72785: Calling groups_inventory to load vars for managed_node1 30529 1726882617.72788: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.72798: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b39 30529 1726882617.72802: WORKER PROCESS EXITING 30529 1726882617.72928: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.72935: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.72942: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.74639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.76374: done with get_vars() 30529 1726882617.76397: done getting variables 30529 1726882617.76455: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:57 -0400 (0:00:00.094) 0:00:31.790 ****** 30529 1726882617.76488: entering _queue_task() for managed_node1/fail 30529 1726882617.77329: worker is 1 (out of 1 available) 30529 1726882617.77341: exiting _queue_task() for managed_node1/fail 30529 1726882617.77353: done queuing things up, now waiting for results queue to drain 30529 1726882617.77355: waiting for pending results... 30529 1726882617.77607: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882617.77944: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3a 30529 1726882617.77948: variable 'ansible_search_path' from source: unknown 30529 1726882617.77957: variable 'ansible_search_path' from source: unknown 30529 1726882617.77978: calling self._execute() 30529 1726882617.78090: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.78161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.78164: variable 'omit' from source: magic vars 30529 1726882617.78515: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.78533: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.78661: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882617.78872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882617.81116: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882617.81195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882617.81244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882617.81328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882617.81331: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882617.81404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.81446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.81475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.81522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.81550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.81601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 
1726882617.81655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.81669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.81714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.81763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.81784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.81814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.81843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.81892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.81982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882617.82103: variable 'network_connections' from source: include params 30529 1726882617.82119: variable 'interface' from source: play vars 30529 1726882617.82188: variable 'interface' from source: play vars 30529 1726882617.82267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882617.82442: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882617.82492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882617.82537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882617.82569: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882617.82618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882617.82651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882617.82745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.82748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882617.82779: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882617.83031: variable 'network_connections' from source: include params 30529 1726882617.83041: variable 'interface' from source: play 
vars 30529 1726882617.83110: variable 'interface' from source: play vars 30529 1726882617.83147: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882617.83156: when evaluation is False, skipping this task 30529 1726882617.83164: _execute() done 30529 1726882617.83177: dumping result to json 30529 1726882617.83189: done dumping result, returning 30529 1726882617.83205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000b3a] 30529 1726882617.83399: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3a 30529 1726882617.83485: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3a 30529 1726882617.83489: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882617.83544: no more pending results, returning what we have 30529 1726882617.83548: results queue empty 30529 1726882617.83549: checking for any_errors_fatal 30529 1726882617.83555: done checking for any_errors_fatal 30529 1726882617.83556: checking for max_fail_percentage 30529 1726882617.83558: done checking for max_fail_percentage 30529 1726882617.83559: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.83561: done checking to see if all hosts have failed 30529 1726882617.83561: getting the remaining hosts for this loop 30529 1726882617.83563: done getting the remaining hosts for this loop 30529 1726882617.83567: getting the next task for host managed_node1 30529 1726882617.83577: done getting next task for host managed_node1 30529 1726882617.83581: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882617.83588: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882617.83618: getting variables 30529 1726882617.83621: in VariableManager get_vars() 30529 1726882617.83657: Calling all_inventory to load vars for managed_node1 30529 1726882617.83660: Calling groups_inventory to load vars for managed_node1 30529 1726882617.83662: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.83674: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.83677: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.83680: Calling groups_plugins_play to load vars for managed_node1 30529 1726882617.85239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882617.86843: done with get_vars() 30529 1726882617.86872: done getting variables 30529 1726882617.86932: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:57 -0400 (0:00:00.104) 0:00:31.895 ****** 30529 1726882617.86967: entering _queue_task() for managed_node1/package 30529 1726882617.87428: worker is 1 (out of 1 available) 30529 1726882617.87438: exiting _queue_task() for managed_node1/package 30529 1726882617.87449: done queuing things up, now waiting for results queue to drain 30529 1726882617.87450: waiting for pending results... 
30529 1726882617.87692: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882617.87900: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3b 30529 1726882617.87905: variable 'ansible_search_path' from source: unknown 30529 1726882617.87908: variable 'ansible_search_path' from source: unknown 30529 1726882617.87912: calling self._execute() 30529 1726882617.87971: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882617.87983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882617.88009: variable 'omit' from source: magic vars 30529 1726882617.88397: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.88417: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882617.88624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882617.88913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882617.88961: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882617.89007: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882617.89404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882617.89529: variable 'network_packages' from source: role '' defaults 30529 1726882617.89646: variable '__network_provider_setup' from source: role '' defaults 30529 1726882617.89747: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882617.89750: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882617.89753: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882617.89810: variable 
'__network_packages_default_nm' from source: role '' defaults 30529 1726882617.90009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882617.91983: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882617.92058: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882617.92102: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882617.92142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882617.92178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882617.92267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.92314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.92346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.92399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.92500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882617.92503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.92506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.92531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.92574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.92592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.92825: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882617.92949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.92978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.93010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.93059: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.93164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.93176: variable 'ansible_python' from source: facts 30529 1726882617.93202: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882617.93294: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882617.93373: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882617.93513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.93542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.93571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.93624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.93644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.93719: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882617.93738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882617.93798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.93812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882617.93838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882617.93988: variable 'network_connections' from source: include params 30529 1726882617.94003: variable 'interface' from source: play vars 30529 1726882617.94112: variable 'interface' from source: play vars 30529 1726882617.94199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882617.94264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882617.94268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882617.94306: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882617.94356: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882617.94660: variable 'network_connections' from source: include params 30529 1726882617.94671: variable 'interface' from source: play vars 30529 1726882617.94799: variable 'interface' from source: play vars 30529 1726882617.94842: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882617.94931: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882617.95260: variable 'network_connections' from source: include params 30529 1726882617.95399: variable 'interface' from source: play vars 30529 1726882617.95403: variable 'interface' from source: play vars 30529 1726882617.95405: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882617.95444: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882617.95772: variable 'network_connections' from source: include params 30529 1726882617.95782: variable 'interface' from source: play vars 30529 1726882617.95856: variable 'interface' from source: play vars 30529 1726882617.95921: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882617.95990: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882617.96008: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882617.96077: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882617.96313: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882617.96912: variable 'network_connections' from source: include params 30529 1726882617.96915: variable 'interface' from 
source: play vars 30529 1726882617.96917: variable 'interface' from source: play vars 30529 1726882617.96920: variable 'ansible_distribution' from source: facts 30529 1726882617.96922: variable '__network_rh_distros' from source: role '' defaults 30529 1726882617.96924: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.97100: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882617.97333: variable 'ansible_distribution' from source: facts 30529 1726882617.97702: variable '__network_rh_distros' from source: role '' defaults 30529 1726882617.97705: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.97708: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882617.97789: variable 'ansible_distribution' from source: facts 30529 1726882617.97804: variable '__network_rh_distros' from source: role '' defaults 30529 1726882617.97819: variable 'ansible_distribution_major_version' from source: facts 30529 1726882617.97857: variable 'network_provider' from source: set_fact 30529 1726882617.97877: variable 'ansible_facts' from source: unknown 30529 1726882617.99209: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882617.99398: when evaluation is False, skipping this task 30529 1726882617.99401: _execute() done 30529 1726882617.99404: dumping result to json 30529 1726882617.99406: done dumping result, returning 30529 1726882617.99409: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000000b3b] 30529 1726882617.99411: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3b skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 
1726882617.99544: no more pending results, returning what we have 30529 1726882617.99550: results queue empty 30529 1726882617.99551: checking for any_errors_fatal 30529 1726882617.99556: done checking for any_errors_fatal 30529 1726882617.99557: checking for max_fail_percentage 30529 1726882617.99559: done checking for max_fail_percentage 30529 1726882617.99560: checking to see if all hosts have failed and the running result is not ok 30529 1726882617.99561: done checking to see if all hosts have failed 30529 1726882617.99562: getting the remaining hosts for this loop 30529 1726882617.99564: done getting the remaining hosts for this loop 30529 1726882617.99568: getting the next task for host managed_node1 30529 1726882617.99578: done getting next task for host managed_node1 30529 1726882617.99582: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882617.99588: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882617.99612: getting variables 30529 1726882617.99615: in VariableManager get_vars() 30529 1726882617.99652: Calling all_inventory to load vars for managed_node1 30529 1726882617.99655: Calling groups_inventory to load vars for managed_node1 30529 1726882617.99662: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882617.99674: Calling all_plugins_play to load vars for managed_node1 30529 1726882617.99677: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882617.99681: Calling groups_plugins_play to load vars for managed_node1 30529 1726882618.00550: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3b 30529 1726882618.00554: WORKER PROCESS EXITING 30529 1726882618.03081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882618.06386: done with get_vars() 30529 1726882618.06412: done getting variables 30529 1726882618.06555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:58 -0400 (0:00:00.196) 0:00:32.091 ****** 30529 1726882618.06590: entering _queue_task() for managed_node1/package 30529 1726882618.07366: worker is 1 (out of 1 available) 30529 1726882618.07378: exiting _queue_task() for managed_node1/package 30529 1726882618.07391: done queuing things up, now waiting for results queue to drain 30529 
1726882618.07507: waiting for pending results... 30529 1726882618.08112: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882618.08499: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3c 30529 1726882618.08503: variable 'ansible_search_path' from source: unknown 30529 1726882618.08506: variable 'ansible_search_path' from source: unknown 30529 1726882618.08508: calling self._execute() 30529 1726882618.08510: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.08512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882618.08515: variable 'omit' from source: magic vars 30529 1726882618.09242: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.09513: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882618.09632: variable 'network_state' from source: role '' defaults 30529 1726882618.09649: Evaluated conditional (network_state != {}): False 30529 1726882618.09656: when evaluation is False, skipping this task 30529 1726882618.09663: _execute() done 30529 1726882618.09669: dumping result to json 30529 1726882618.09676: done dumping result, returning 30529 1726882618.09687: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000b3c] 30529 1726882618.09703: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882618.09861: no more pending results, returning what we have 30529 1726882618.09865: results queue empty 30529 1726882618.09866: checking for any_errors_fatal 30529 1726882618.09873: done checking for any_errors_fatal 30529 
1726882618.09873: checking for max_fail_percentage 30529 1726882618.09875: done checking for max_fail_percentage 30529 1726882618.09876: checking to see if all hosts have failed and the running result is not ok 30529 1726882618.09877: done checking to see if all hosts have failed 30529 1726882618.09878: getting the remaining hosts for this loop 30529 1726882618.09879: done getting the remaining hosts for this loop 30529 1726882618.09883: getting the next task for host managed_node1 30529 1726882618.09892: done getting next task for host managed_node1 30529 1726882618.09898: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882618.09953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882618.09983: getting variables 30529 1726882618.09985: in VariableManager get_vars() 30529 1726882618.10109: Calling all_inventory to load vars for managed_node1 30529 1726882618.10112: Calling groups_inventory to load vars for managed_node1 30529 1726882618.10115: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882618.10133: Calling all_plugins_play to load vars for managed_node1 30529 1726882618.10137: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882618.10141: Calling groups_plugins_play to load vars for managed_node1 30529 1726882618.10706: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3c 30529 1726882618.10710: WORKER PROCESS EXITING 30529 1726882618.12657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882618.15067: done with get_vars() 30529 1726882618.15096: done getting variables 30529 1726882618.15157: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:58 -0400 (0:00:00.086) 0:00:32.178 ****** 30529 1726882618.15199: entering _queue_task() for managed_node1/package 30529 1726882618.15551: worker is 1 (out of 1 available) 30529 1726882618.15564: exiting _queue_task() for managed_node1/package 30529 1726882618.15577: done queuing things up, now waiting for results queue to drain 30529 1726882618.15579: waiting for pending results... 
30529 1726882618.15810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882618.15957: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3d 30529 1726882618.15978: variable 'ansible_search_path' from source: unknown 30529 1726882618.15988: variable 'ansible_search_path' from source: unknown 30529 1726882618.16032: calling self._execute() 30529 1726882618.16499: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.16502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882618.16507: variable 'omit' from source: magic vars 30529 1726882618.17048: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.17065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882618.17196: variable 'network_state' from source: role '' defaults 30529 1726882618.17330: Evaluated conditional (network_state != {}): False 30529 1726882618.17365: when evaluation is False, skipping this task 30529 1726882618.17374: _execute() done 30529 1726882618.17382: dumping result to json 30529 1726882618.17395: done dumping result, returning 30529 1726882618.17409: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000b3d] 30529 1726882618.17419: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882618.17599: no more pending results, returning what we have 30529 1726882618.17604: results queue empty 30529 1726882618.17605: checking for any_errors_fatal 30529 1726882618.17610: done checking for any_errors_fatal 30529 1726882618.17611: checking for max_fail_percentage 30529 
1726882618.17613: done checking for max_fail_percentage 30529 1726882618.17613: checking to see if all hosts have failed and the running result is not ok 30529 1726882618.17614: done checking to see if all hosts have failed 30529 1726882618.17615: getting the remaining hosts for this loop 30529 1726882618.17617: done getting the remaining hosts for this loop 30529 1726882618.17621: getting the next task for host managed_node1 30529 1726882618.17629: done getting next task for host managed_node1 30529 1726882618.17633: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882618.17639: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882618.17656: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3d 30529 1726882618.17659: WORKER PROCESS EXITING 30529 1726882618.17717: getting variables 30529 1726882618.17719: in VariableManager get_vars() 30529 1726882618.17752: Calling all_inventory to load vars for managed_node1 30529 1726882618.17754: Calling groups_inventory to load vars for managed_node1 30529 1726882618.17756: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882618.17768: Calling all_plugins_play to load vars for managed_node1 30529 1726882618.17771: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882618.17774: Calling groups_plugins_play to load vars for managed_node1 30529 1726882618.19081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882618.22018: done with get_vars() 30529 1726882618.22044: done getting variables 30529 1726882618.22111: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:58 -0400 (0:00:00.069) 0:00:32.247 ****** 30529 1726882618.22149: entering _queue_task() for managed_node1/service 30529 1726882618.22918: worker is 1 (out of 1 available) 30529 1726882618.22931: exiting _queue_task() for managed_node1/service 30529 1726882618.22943: done queuing things up, now waiting for results queue to drain 30529 1726882618.22945: waiting for pending results... 
30529 1726882618.23404: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882618.23871: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3e 30529 1726882618.24000: variable 'ansible_search_path' from source: unknown 30529 1726882618.24004: variable 'ansible_search_path' from source: unknown 30529 1726882618.24008: calling self._execute() 30529 1726882618.24134: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.24138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882618.24150: variable 'omit' from source: magic vars 30529 1726882618.24830: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.24844: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882618.25206: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882618.25580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882618.30901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882618.31612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882618.31660: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882618.31894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882618.31898: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882618.32102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882618.32106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.32109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.32244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.32263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.32402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.32441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.32470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.32578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.32599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.32687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.32780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.32811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.32907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.32990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.33284: variable 'network_connections' from source: include params 30529 1726882618.33498: variable 'interface' from source: play vars 30529 1726882618.33501: variable 'interface' from source: play vars 30529 1726882618.33661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882618.34166: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882618.34170: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882618.34172: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882618.34282: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882618.34349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882618.34602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882618.34605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.34608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882618.34732: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882618.35208: variable 'network_connections' from source: include params 30529 1726882618.35259: variable 'interface' from source: play vars 30529 1726882618.35415: variable 'interface' from source: play vars 30529 1726882618.35452: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882618.35505: when evaluation is False, skipping this task 30529 1726882618.35512: _execute() done 30529 1726882618.35519: dumping result to json 30529 1726882618.35526: done dumping result, returning 30529 1726882618.35537: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000b3e] 30529 1726882618.35546: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3e 30529 1726882618.35867: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000000b3e 30529 1726882618.35878: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882618.35930: no more pending results, returning what we have 30529 1726882618.35934: results queue empty 30529 1726882618.35935: checking for any_errors_fatal 30529 1726882618.35941: done checking for any_errors_fatal 30529 1726882618.35942: checking for max_fail_percentage 30529 1726882618.35944: done checking for max_fail_percentage 30529 1726882618.35945: checking to see if all hosts have failed and the running result is not ok 30529 1726882618.35946: done checking to see if all hosts have failed 30529 1726882618.35946: getting the remaining hosts for this loop 30529 1726882618.35948: done getting the remaining hosts for this loop 30529 1726882618.35952: getting the next task for host managed_node1 30529 1726882618.35963: done getting next task for host managed_node1 30529 1726882618.35967: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882618.35972: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882618.35996: getting variables 30529 1726882618.35998: in VariableManager get_vars() 30529 1726882618.36047: Calling all_inventory to load vars for managed_node1 30529 1726882618.36050: Calling groups_inventory to load vars for managed_node1 30529 1726882618.36053: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882618.36064: Calling all_plugins_play to load vars for managed_node1 30529 1726882618.36067: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882618.36070: Calling groups_plugins_play to load vars for managed_node1 30529 1726882618.39587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882618.42791: done with get_vars() 30529 1726882618.42921: done getting variables 30529 1726882618.42986: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:58 -0400 (0:00:00.208) 0:00:32.456 ****** 30529 1726882618.43024: entering _queue_task() for managed_node1/service 30529 1726882618.43784: worker is 1 (out of 1 available) 30529 1726882618.43910: exiting _queue_task() for managed_node1/service 30529 1726882618.43923: done 
queuing things up, now waiting for results queue to drain 30529 1726882618.43925: waiting for pending results... 30529 1726882618.44413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882618.44576: in run() - task 12673a56-9f93-b0f1-edc0-000000000b3f 30529 1726882618.44599: variable 'ansible_search_path' from source: unknown 30529 1726882618.44603: variable 'ansible_search_path' from source: unknown 30529 1726882618.44750: calling self._execute() 30529 1726882618.44962: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.44970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882618.44976: variable 'omit' from source: magic vars 30529 1726882618.45798: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.45802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882618.46175: variable 'network_provider' from source: set_fact 30529 1726882618.46184: variable 'network_state' from source: role '' defaults 30529 1726882618.46187: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882618.46200: variable 'omit' from source: magic vars 30529 1726882618.46317: variable 'omit' from source: magic vars 30529 1726882618.46345: variable 'network_service_name' from source: role '' defaults 30529 1726882618.46598: variable 'network_service_name' from source: role '' defaults 30529 1726882618.46746: variable '__network_provider_setup' from source: role '' defaults 30529 1726882618.46751: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882618.46930: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882618.46943: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882618.47001: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882618.47517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882618.51950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882618.52176: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882618.52216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882618.52253: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882618.52359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882618.52521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.52551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.52585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.52898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.52902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.52904: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.52929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.52954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.52996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.53013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.53538: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882618.53766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.54000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.54003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.54006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.54008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.54182: variable 'ansible_python' from source: facts 30529 1726882618.54203: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882618.54375: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882618.54602: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882618.54889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.54919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.54943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.55098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.55115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.55160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882618.55183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882618.55411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.55415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882618.55417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882618.55745: variable 'network_connections' from source: include params 30529 1726882618.55754: variable 'interface' from source: play vars 30529 1726882618.55831: variable 'interface' from source: play vars 30529 1726882618.56110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882618.56523: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882618.56568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882618.56715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882618.56754: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882618.56999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882618.57002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882618.57015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882618.57112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882618.57271: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882618.57767: variable 'network_connections' from source: include params 30529 1726882618.57773: variable 'interface' from source: play vars 30529 1726882618.57939: variable 'interface' from source: play vars 30529 1726882618.57983: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882618.58237: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882618.58900: variable 'network_connections' from source: include params 30529 1726882618.58904: variable 'interface' from source: play vars 30529 1726882618.58982: variable 'interface' from source: play vars 30529 1726882618.59076: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882618.59335: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882618.59619: variable 'network_connections' from source: include params 30529 1726882618.59630: variable 'interface' from source: play vars 30529 1726882618.59708: variable 'interface' from source: play vars 30529 1726882618.59771: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882618.59838: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882618.59878: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882618.59922: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882618.60146: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882618.60852: variable 'network_connections' from source: include params 30529 1726882618.60963: variable 'interface' from source: play vars 30529 1726882618.60966: variable 'interface' from source: play vars 30529 1726882618.60968: variable 'ansible_distribution' from source: facts 30529 1726882618.60972: variable '__network_rh_distros' from source: role '' defaults 30529 1726882618.60974: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.61003: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882618.61184: variable 'ansible_distribution' from source: facts 30529 1726882618.61199: variable '__network_rh_distros' from source: role '' defaults 30529 1726882618.61211: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.61226: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882618.61407: variable 'ansible_distribution' from source: facts 30529 1726882618.61420: variable '__network_rh_distros' from source: role '' defaults 30529 1726882618.61430: variable 'ansible_distribution_major_version' from source: facts 30529 1726882618.61508: variable 'network_provider' from source: set_fact 30529 1726882618.61511: variable 'omit' from source: magic vars 30529 1726882618.61532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882618.61563: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882618.61586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882618.61617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882618.61637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882618.61725: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882618.61728: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.61731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882618.61797: Set connection var ansible_shell_executable to /bin/sh 30529 1726882618.61809: Set connection var ansible_pipelining to False 30529 1726882618.61816: Set connection var ansible_shell_type to sh 30529 1726882618.61834: Set connection var ansible_timeout to 10 30529 1726882618.61845: Set connection var ansible_connection to ssh 30529 1726882618.61856: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882618.61885: variable 'ansible_shell_executable' from source: unknown 30529 1726882618.61894: variable 'ansible_connection' from source: unknown 30529 1726882618.61902: variable 'ansible_module_compression' from source: unknown 30529 1726882618.61909: variable 'ansible_shell_type' from source: unknown 30529 1726882618.61941: variable 'ansible_shell_executable' from source: unknown 30529 1726882618.61944: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882618.61946: variable 'ansible_pipelining' from source: unknown 30529 1726882618.61948: variable 'ansible_timeout' from source: unknown 30529 1726882618.61954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882618.62062: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882618.62160: variable 'omit' from source: magic vars 30529 1726882618.62163: starting attempt loop 30529 1726882618.62165: running the handler 30529 1726882618.62180: variable 'ansible_facts' from source: unknown 30529 1726882618.63301: _low_level_execute_command(): starting 30529 1726882618.63313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882618.63967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882618.64016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.64091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882618.64116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882618.64225: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882618.64253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882618.65970: stdout chunk (state=3): >>>/root <<< 30529 1726882618.66130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882618.66133: stdout chunk (state=3): >>><<< 30529 1726882618.66136: stderr chunk (state=3): >>><<< 30529 1726882618.66158: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882618.66249: _low_level_execute_command(): starting 30529 1726882618.66253: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683 `" && echo 
ansible-tmp-1726882618.6616366-32073-94970236537683="` echo /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683 `" ) && sleep 0' 30529 1726882618.66813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.66835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882618.66848: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882618.66858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882618.66871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882618.66919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.66978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882618.67017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882618.67083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882618.69065: stdout chunk (state=3): >>>ansible-tmp-1726882618.6616366-32073-94970236537683=/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683 <<< 30529 1726882618.69129: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30529 1726882618.69199: stdout chunk (state=3): >>><<< 30529 1726882618.69203: stderr chunk (state=3): >>><<< 30529 1726882618.69220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882618.6616366-32073-94970236537683=/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882618.69256: variable 'ansible_module_compression' from source: unknown 30529 1726882618.69503: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882618.69540: variable 'ansible_facts' from source: unknown 30529 1726882618.69982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py 30529 
1726882618.70167: Sending initial data 30529 1726882618.70176: Sent initial data (155 bytes) 30529 1726882618.70769: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882618.70817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.70836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882618.70928: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882618.70953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882618.71031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882618.72563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882618.72656: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882618.72677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882618.72811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcwsr5b58 /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py <<< 30529 1726882618.72816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcwsr5b58" to remote "/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py" <<< 30529 1726882618.75741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882618.75812: stderr chunk (state=3): >>><<< 30529 1726882618.75815: stdout chunk (state=3): >>><<< 30529 1726882618.75826: done transferring module to remote 30529 1726882618.75836: _low_level_execute_command(): starting 30529 1726882618.75841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/ /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py && sleep 0' 30529 1726882618.76484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882618.76498: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882618.76510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882618.76525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882618.76570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882618.76573: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882618.76576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.76614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.76658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882618.76687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882618.76690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882618.76774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882618.78496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882618.78519: stderr chunk (state=3): >>><<< 30529 1726882618.78522: stdout chunk (state=3): >>><<< 30529 1726882618.78594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882618.78598: _low_level_execute_command(): starting 30529 1726882618.78601: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/AnsiballZ_systemd.py && sleep 0' 30529 1726882618.79248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882618.79265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882618.79361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882618.79400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882618.79417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882618.79440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882618.79546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.08042: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": 
"Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10838016", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312398336", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1744339000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", 
"StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882619.08077: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882619.08092: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882619.09753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882619.09776: stderr chunk (state=3): >>><<< 30529 1726882619.09779: stdout chunk (state=3): >>><<< 30529 1726882619.09803: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10838016", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312398336", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1744339000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882619.09941: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882619.09956: _low_level_execute_command(): starting 30529 1726882619.09961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882618.6616366-32073-94970236537683/ > /dev/null 2>&1 && sleep 0' 30529 1726882619.10815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882619.10903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882619.10910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.10942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.10947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882619.10966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.11076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.13089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882619.13098: stdout chunk (state=3): >>><<< 30529 1726882619.13109: stderr chunk (state=3): >>><<< 30529 1726882619.13125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30529 1726882619.13132: handler run complete 30529 1726882619.13199: attempt loop complete, returning result 30529 1726882619.13405: _execute() done 30529 1726882619.13409: dumping result to json 30529 1726882619.13483: done dumping result, returning 30529 1726882619.13487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000000b3f] 30529 1726882619.13489: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3f ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882619.13908: no more pending results, returning what we have 30529 1726882619.13921: results queue empty 30529 1726882619.13922: checking for any_errors_fatal 30529 1726882619.13926: done checking for any_errors_fatal 30529 1726882619.13927: checking for max_fail_percentage 30529 1726882619.13929: done checking for max_fail_percentage 30529 1726882619.13929: checking to see if all hosts have failed and the running result is not ok 30529 1726882619.13930: done checking to see if all hosts have failed 30529 1726882619.13931: getting the remaining hosts for this loop 30529 1726882619.13933: done getting the remaining hosts for this loop 30529 1726882619.13936: getting the next task for host managed_node1 30529 1726882619.13944: done getting next task for host managed_node1 30529 1726882619.13947: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882619.13951: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882619.13963: getting variables 30529 1726882619.13966: in VariableManager get_vars() 30529 1726882619.14101: Calling all_inventory to load vars for managed_node1 30529 1726882619.14104: Calling groups_inventory to load vars for managed_node1 30529 1726882619.14108: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882619.14115: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b3f 30529 1726882619.14118: WORKER PROCESS EXITING 30529 1726882619.14198: Calling all_plugins_play to load vars for managed_node1 30529 1726882619.14203: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882619.14207: Calling groups_plugins_play to load vars for managed_node1 30529 1726882619.15858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882619.17575: done with get_vars() 30529 1726882619.17602: done getting variables 30529 1726882619.17662: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:59 -0400 (0:00:00.746) 0:00:33.203 ****** 30529 1726882619.17709: entering _queue_task() for managed_node1/service 30529 1726882619.18055: worker is 1 (out of 1 available) 30529 1726882619.18067: exiting _queue_task() for managed_node1/service 30529 1726882619.18079: done queuing things up, now waiting for results queue to drain 30529 1726882619.18081: waiting for pending results... 30529 1726882619.18413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882619.18536: in run() - task 12673a56-9f93-b0f1-edc0-000000000b40 30529 1726882619.18557: variable 'ansible_search_path' from source: unknown 30529 1726882619.18562: variable 'ansible_search_path' from source: unknown 30529 1726882619.18601: calling self._execute() 30529 1726882619.18701: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.18705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.18719: variable 'omit' from source: magic vars 30529 1726882619.19110: variable 'ansible_distribution_major_version' from source: facts 30529 1726882619.19123: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882619.19247: variable 'network_provider' from source: set_fact 30529 1726882619.19250: Evaluated conditional (network_provider == "nm"): True 30529 1726882619.19375: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 
1726882619.19439: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882619.19910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882619.24408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882619.24587: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882619.24638: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882619.24665: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882619.24807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882619.24904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882619.25129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882619.25155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882619.25197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882619.25214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882619.25259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882619.25290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882619.25309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882619.25347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882619.25360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882619.25604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882619.25629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882619.25653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882619.25690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882619.25949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882619.26038: variable 'network_connections' from source: include params 30529 1726882619.26057: variable 'interface' from source: play vars 30529 1726882619.26120: variable 'interface' from source: play vars 30529 1726882619.26191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882619.26558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882619.26604: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882619.26731: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882619.26760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882619.26805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882619.26827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882619.26850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882619.26874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882619.27230: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882619.27566: variable 'network_connections' from source: include params 30529 1726882619.27569: variable 'interface' from source: play vars 30529 1726882619.27646: variable 'interface' from source: play vars 30529 1726882619.27703: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882619.27706: when evaluation is False, skipping this task 30529 1726882619.27708: _execute() done 30529 1726882619.27711: dumping result to json 30529 1726882619.27713: done dumping result, returning 30529 1726882619.27718: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000000b40] 30529 1726882619.27733: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b40 30529 1726882619.27821: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b40 30529 1726882619.27824: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882619.27883: no more pending results, returning what we have 30529 1726882619.27887: results queue empty 30529 1726882619.27888: checking for any_errors_fatal 30529 1726882619.27919: done checking for any_errors_fatal 30529 1726882619.27920: checking for max_fail_percentage 30529 1726882619.27921: done checking for max_fail_percentage 30529 1726882619.27922: checking to see if all hosts have failed and the running result is not ok 30529 1726882619.27923: done checking to see if all hosts have failed 30529 1726882619.27924: getting the remaining hosts for this loop 30529 1726882619.27926: done getting the remaining hosts for this loop 30529 1726882619.27930: getting the next task 
for host managed_node1 30529 1726882619.27940: done getting next task for host managed_node1 30529 1726882619.27944: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882619.27948: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882619.27968: getting variables 30529 1726882619.27970: in VariableManager get_vars() 30529 1726882619.28121: Calling all_inventory to load vars for managed_node1 30529 1726882619.28124: Calling groups_inventory to load vars for managed_node1 30529 1726882619.28126: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882619.28135: Calling all_plugins_play to load vars for managed_node1 30529 1726882619.28138: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882619.28141: Calling groups_plugins_play to load vars for managed_node1 30529 1726882619.29010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882619.30652: done with get_vars() 30529 1726882619.30690: done getting variables 30529 1726882619.30749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:59 -0400 (0:00:00.130) 0:00:33.333 ****** 30529 1726882619.30789: entering _queue_task() for managed_node1/service 30529 1726882619.31137: worker is 1 (out of 1 available) 30529 1726882619.31150: exiting _queue_task() for managed_node1/service 30529 1726882619.31163: done queuing things up, now waiting for results queue to drain 30529 1726882619.31165: waiting for pending results... 
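The wpa_supplicant task above is skipped because `Evaluated conditional (__network_wpa_supplicant_required): False`. These `Evaluated conditional (...): True/False` lines are the quickest way to audit why tasks ran or skipped in a `-vvv` trace; a small sketch that extracts them (the log excerpt is abridged from the trace above, and the parsing helper is illustrative, not an Ansible API):

```python
import re

# Pull "Evaluated conditional (...): True/False" decisions out of a chunk of
# ansible -vvv output so skipped tasks can be summarised at a glance.

LOG = """
Evaluated conditional (ansible_distribution_major_version != '6'): True
Evaluated conditional (network_provider == "nm"): True
Evaluated conditional (__network_wpa_supplicant_required): False
"""

PATTERN = re.compile(r"Evaluated conditional \((.+?)\): (True|False)")

def extract_conditionals(text: str) -> list[tuple[str, bool]]:
    """Return (condition, result) pairs in the order they appear."""
    return [(cond, result == "True") for cond, result in PATTERN.findall(text)]

for cond, result in extract_conditionals(LOG):
    print(f"{cond!r} -> {result}")
```

Any `False` entry corresponds to a `skipping:` result later in the trace, with the condition echoed back as `false_condition`.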
30529 1726882619.31383: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882619.31475: in run() - task 12673a56-9f93-b0f1-edc0-000000000b41 30529 1726882619.31488: variable 'ansible_search_path' from source: unknown 30529 1726882619.31497: variable 'ansible_search_path' from source: unknown 30529 1726882619.31524: calling self._execute() 30529 1726882619.31595: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.31599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.31607: variable 'omit' from source: magic vars 30529 1726882619.31868: variable 'ansible_distribution_major_version' from source: facts 30529 1726882619.31877: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882619.31962: variable 'network_provider' from source: set_fact 30529 1726882619.31967: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882619.31970: when evaluation is False, skipping this task 30529 1726882619.31973: _execute() done 30529 1726882619.31975: dumping result to json 30529 1726882619.31977: done dumping result, returning 30529 1726882619.31985: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000000b41] 30529 1726882619.31992: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b41 30529 1726882619.32079: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b41 30529 1726882619.32081: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882619.32127: no more pending results, returning what we have 30529 1726882619.32131: results queue empty 30529 1726882619.32133: checking for any_errors_fatal 30529 1726882619.32141: done checking for 
any_errors_fatal 30529 1726882619.32141: checking for max_fail_percentage 30529 1726882619.32143: done checking for max_fail_percentage 30529 1726882619.32144: checking to see if all hosts have failed and the running result is not ok 30529 1726882619.32145: done checking to see if all hosts have failed 30529 1726882619.32145: getting the remaining hosts for this loop 30529 1726882619.32147: done getting the remaining hosts for this loop 30529 1726882619.32150: getting the next task for host managed_node1 30529 1726882619.32159: done getting next task for host managed_node1 30529 1726882619.32162: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882619.32167: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882619.32185: getting variables 30529 1726882619.32190: in VariableManager get_vars() 30529 1726882619.32240: Calling all_inventory to load vars for managed_node1 30529 1726882619.32243: Calling groups_inventory to load vars for managed_node1 30529 1726882619.32245: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882619.32256: Calling all_plugins_play to load vars for managed_node1 30529 1726882619.32259: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882619.32262: Calling groups_plugins_play to load vars for managed_node1 30529 1726882619.33903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882619.36104: done with get_vars() 30529 1726882619.36132: done getting variables 30529 1726882619.36342: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:59 -0400 (0:00:00.055) 0:00:33.389 ****** 30529 1726882619.36377: entering _queue_task() for managed_node1/copy 30529 1726882619.37038: worker is 1 (out of 1 available) 30529 1726882619.37197: exiting _queue_task() for managed_node1/copy 30529 1726882619.37209: done queuing things up, now waiting for results queue to drain 30529 1726882619.37211: waiting for pending results... 
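Both "Enable network service" and the following "Ensure initscripts network file dependency is present" task are skipped the same way: each is gated on `network_provider == "initscripts"`, and this run set `network_provider` to `nm` via `set_fact`. A sketch of that gating decision, with the task names and condition taken from the trace (the dispatch helper itself is hypothetical, not role code):

```python
# Hypothetical helper mirroring how the role's initscripts-only tasks are
# skipped when the active provider is NetworkManager ("nm").

INITSCRIPTS_ONLY_TASKS = [
    "Enable network service",
    "Ensure initscripts network file dependency is present",
]

def tasks_to_run(network_provider: str) -> list[str]:
    """Return which initscripts-only tasks would actually execute."""
    if network_provider == "initscripts":
        return list(INITSCRIPTS_ONLY_TASKS)
    # provider is "nm" (or anything else): every initscripts-only task is
    # skipped with false_condition 'network_provider == "initscripts"'.
    return []

print(tasks_to_run("nm"))           # []
print(tasks_to_run("initscripts"))  # both tasks run
```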
30529 1726882619.37619: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882619.37663: in run() - task 12673a56-9f93-b0f1-edc0-000000000b42 30529 1726882619.37683: variable 'ansible_search_path' from source: unknown 30529 1726882619.37691: variable 'ansible_search_path' from source: unknown 30529 1726882619.37743: calling self._execute() 30529 1726882619.37848: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.37859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.37929: variable 'omit' from source: magic vars 30529 1726882619.38282: variable 'ansible_distribution_major_version' from source: facts 30529 1726882619.38301: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882619.38432: variable 'network_provider' from source: set_fact 30529 1726882619.38443: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882619.38451: when evaluation is False, skipping this task 30529 1726882619.38457: _execute() done 30529 1726882619.38472: dumping result to json 30529 1726882619.38480: done dumping result, returning 30529 1726882619.38497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000000b42] 30529 1726882619.38508: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b42 30529 1726882619.38899: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b42 30529 1726882619.38902: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882619.38953: no more pending results, returning what we have 30529 1726882619.38956: results queue empty 30529 1726882619.38957: checking for 
any_errors_fatal 30529 1726882619.38961: done checking for any_errors_fatal 30529 1726882619.38962: checking for max_fail_percentage 30529 1726882619.38964: done checking for max_fail_percentage 30529 1726882619.38964: checking to see if all hosts have failed and the running result is not ok 30529 1726882619.38965: done checking to see if all hosts have failed 30529 1726882619.38966: getting the remaining hosts for this loop 30529 1726882619.38967: done getting the remaining hosts for this loop 30529 1726882619.38971: getting the next task for host managed_node1 30529 1726882619.38978: done getting next task for host managed_node1 30529 1726882619.38982: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882619.38986: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882619.39006: getting variables 30529 1726882619.39008: in VariableManager get_vars() 30529 1726882619.39043: Calling all_inventory to load vars for managed_node1 30529 1726882619.39046: Calling groups_inventory to load vars for managed_node1 30529 1726882619.39048: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882619.39057: Calling all_plugins_play to load vars for managed_node1 30529 1726882619.39060: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882619.39063: Calling groups_plugins_play to load vars for managed_node1 30529 1726882619.41354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882619.43065: done with get_vars() 30529 1726882619.43085: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:59 -0400 (0:00:00.067) 0:00:33.457 ****** 30529 1726882619.43152: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882619.43402: worker is 1 (out of 1 available) 30529 1726882619.43414: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882619.43427: done queuing things up, now waiting for results queue to drain 30529 1726882619.43429: waiting for pending results... 
30529 1726882619.43620: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882619.43713: in run() - task 12673a56-9f93-b0f1-edc0-000000000b43 30529 1726882619.43725: variable 'ansible_search_path' from source: unknown 30529 1726882619.43728: variable 'ansible_search_path' from source: unknown 30529 1726882619.43755: calling self._execute() 30529 1726882619.43830: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.43833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.43843: variable 'omit' from source: magic vars 30529 1726882619.44132: variable 'ansible_distribution_major_version' from source: facts 30529 1726882619.44136: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882619.44138: variable 'omit' from source: magic vars 30529 1726882619.44404: variable 'omit' from source: magic vars 30529 1726882619.44408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882619.46456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882619.46523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882619.46565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882619.46596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882619.46615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882619.46671: variable 'network_provider' from source: set_fact 30529 1726882619.46760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882619.46780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882619.46799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882619.46826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882619.46837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882619.46892: variable 'omit' from source: magic vars 30529 1726882619.46958: variable 'omit' from source: magic vars 30529 1726882619.47030: variable 'network_connections' from source: include params 30529 1726882619.47039: variable 'interface' from source: play vars 30529 1726882619.47081: variable 'interface' from source: play vars 30529 1726882619.47195: variable 'omit' from source: magic vars 30529 1726882619.47201: variable '__lsr_ansible_managed' from source: task vars 30529 1726882619.47242: variable '__lsr_ansible_managed' from source: task vars 30529 1726882619.47356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882619.47497: Loaded config def from plugin (lookup/template) 30529 1726882619.47501: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882619.47524: File lookup term: get_ansible_managed.j2 30529 1726882619.47527: variable 
'ansible_search_path' from source: unknown 30529 1726882619.47530: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882619.47541: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882619.47554: variable 'ansible_search_path' from source: unknown 30529 1726882619.50661: variable 'ansible_managed' from source: unknown 30529 1726882619.50737: variable 'omit' from source: magic vars 30529 1726882619.50764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882619.50783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882619.50800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882619.50812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882619.50821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882619.50841: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882619.50844: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.50847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.50913: Set connection var ansible_shell_executable to /bin/sh 30529 1726882619.50916: Set connection var ansible_pipelining to False 30529 1726882619.50919: Set connection var ansible_shell_type to sh 30529 1726882619.50927: Set connection var ansible_timeout to 10 30529 1726882619.50929: Set connection var ansible_connection to ssh 30529 1726882619.50934: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882619.50950: variable 'ansible_shell_executable' from source: unknown 30529 1726882619.50953: variable 'ansible_connection' from source: unknown 30529 1726882619.50955: variable 'ansible_module_compression' from source: unknown 30529 1726882619.50957: variable 'ansible_shell_type' from source: unknown 30529 1726882619.50971: variable 'ansible_shell_executable' from source: unknown 30529 1726882619.50975: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882619.50977: variable 'ansible_pipelining' from source: unknown 30529 1726882619.50979: variable 'ansible_timeout' from source: unknown 30529 1726882619.50983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882619.51068: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882619.51079: variable 'omit' from 
source: magic vars 30529 1726882619.51082: starting attempt loop 30529 1726882619.51085: running the handler 30529 1726882619.51099: _low_level_execute_command(): starting 30529 1726882619.51108: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882619.51588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.51596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882619.51623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.51626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.51629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.51676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.51688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.51743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.53343: stdout chunk (state=3): >>>/root <<< 30529 1726882619.53597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 
1726882619.53601: stdout chunk (state=3): >>><<< 30529 1726882619.53603: stderr chunk (state=3): >>><<< 30529 1726882619.53606: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882619.53608: _low_level_execute_command(): starting 30529 1726882619.53610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726 `" && echo ansible-tmp-1726882619.535815-32122-232068603865726="` echo /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726 `" ) && sleep 0' 30529 1726882619.54506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882619.54514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882619.54532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.54538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882619.54581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882619.54584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882619.54586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.54588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.54624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.54636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882619.54649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.54716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.56633: stdout chunk (state=3): >>>ansible-tmp-1726882619.535815-32122-232068603865726=/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726 <<< 30529 1726882619.56737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882619.56740: stderr chunk (state=3): >>><<< 30529 1726882619.56802: stdout chunk (state=3): >>><<< 30529 1726882619.56806: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882619.535815-32122-232068603865726=/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882619.56836: variable 'ansible_module_compression' from source: unknown 30529 1726882619.56885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882619.56957: variable 'ansible_facts' from source: unknown 30529 1726882619.57125: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py 30529 1726882619.57232: Sending initial data 30529 1726882619.57236: Sent initial data (167 bytes) 30529 1726882619.57665: 
stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882619.57673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882619.57703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.57706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.57714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.57751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.57765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.57817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.59318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882619.59325: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882619.59357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882619.59398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp7voqy_x9 /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py <<< 30529 1726882619.59406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py" <<< 30529 1726882619.59437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp7voqy_x9" to remote "/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py" <<< 30529 1726882619.59444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py" <<< 30529 1726882619.60429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882619.60432: stdout chunk (state=3): >>><<< 30529 1726882619.60434: stderr chunk (state=3): >>><<< 30529 1726882619.60457: done transferring module to remote 30529 1726882619.60471: _low_level_execute_command(): starting 30529 1726882619.60479: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/ 
/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py && sleep 0' 30529 1726882619.61109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882619.61191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882619.61209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.61243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.61256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882619.61274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.61429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.63171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882619.63181: stdout chunk (state=3): >>><<< 30529 1726882619.63195: stderr chunk (state=3): >>><<< 30529 1726882619.63332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882619.63341: _low_level_execute_command(): starting 30529 1726882619.63418: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/AnsiballZ_network_connections.py && sleep 0' 30529 1726882619.63932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882619.63947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882619.63961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.63982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882619.64002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882619.64015: stderr chunk (state=3): >>>debug2: 
match not found <<< 30529 1726882619.64112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.64136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882619.64151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.64228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.91337: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882619.93786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. <<< 30529 1726882619.93813: stderr chunk (state=3): >>><<< 30529 1726882619.93817: stdout chunk (state=3): >>><<< 30529 1726882619.93832: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882619.93862: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882619.93870: _low_level_execute_command(): starting 30529 1726882619.93875: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882619.535815-32122-232068603865726/ > /dev/null 2>&1 && sleep 0' 30529 1726882619.94297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.94301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882619.94328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.94331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882619.94334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882619.94336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882619.94387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882619.94401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882619.94442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882619.96969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882619.97001: stderr chunk (state=3): >>><<< 30529 1726882619.97004: stdout chunk (state=3): >>><<< 30529 1726882619.97027: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882619.97114: handler run complete 30529 1726882619.97116: attempt loop complete, returning result 30529 1726882619.97119: _execute() done 30529 1726882619.97121: dumping result to json 30529 1726882619.97123: done dumping result, returning 30529 1726882619.97131: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000000b43] 30529 1726882619.97133: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b43 30529 1726882619.97260: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b43 30529 1726882619.97263: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 30529 1726882619.97455: no more pending results, returning what we have 30529 1726882619.97459: results queue empty 30529 1726882619.97514: checking for any_errors_fatal 30529 1726882619.97556: done checking for 
any_errors_fatal 30529 1726882619.97557: checking for max_fail_percentage 30529 1726882619.97559: done checking for max_fail_percentage 30529 1726882619.97560: checking to see if all hosts have failed and the running result is not ok 30529 1726882619.97560: done checking to see if all hosts have failed 30529 1726882619.97561: getting the remaining hosts for this loop 30529 1726882619.97563: done getting the remaining hosts for this loop 30529 1726882619.97566: getting the next task for host managed_node1 30529 1726882619.97573: done getting next task for host managed_node1 30529 1726882619.97577: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882619.97584: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882619.97607: getting variables 30529 1726882619.97609: in VariableManager get_vars() 30529 1726882619.97643: Calling all_inventory to load vars for managed_node1 30529 1726882619.97646: Calling groups_inventory to load vars for managed_node1 30529 1726882619.97647: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882619.97654: Calling all_plugins_play to load vars for managed_node1 30529 1726882619.97656: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882619.97658: Calling groups_plugins_play to load vars for managed_node1 30529 1726882619.99045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882620.00224: done with get_vars() 30529 1726882620.00240: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:00 -0400 (0:00:00.571) 0:00:34.029 ****** 30529 1726882620.00323: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882620.00698: worker is 1 (out of 1 available) 30529 1726882620.00714: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882620.00730: done queuing things up, now waiting for results queue to drain 30529 1726882620.00732: waiting for pending results... 
30529 1726882620.00982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882620.01151: in run() - task 12673a56-9f93-b0f1-edc0-000000000b44 30529 1726882620.01159: variable 'ansible_search_path' from source: unknown 30529 1726882620.01164: variable 'ansible_search_path' from source: unknown 30529 1726882620.01258: calling self._execute() 30529 1726882620.01308: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.01311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.01321: variable 'omit' from source: magic vars 30529 1726882620.01668: variable 'ansible_distribution_major_version' from source: facts 30529 1726882620.01672: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882620.01782: variable 'network_state' from source: role '' defaults 30529 1726882620.01796: Evaluated conditional (network_state != {}): False 30529 1726882620.01799: when evaluation is False, skipping this task 30529 1726882620.01802: _execute() done 30529 1726882620.01804: dumping result to json 30529 1726882620.01807: done dumping result, returning 30529 1726882620.01812: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000000b44] 30529 1726882620.01817: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b44 30529 1726882620.01912: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b44 30529 1726882620.01915: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882620.01972: no more pending results, returning what we have 30529 1726882620.01977: results queue empty 30529 1726882620.01978: checking for any_errors_fatal 30529 1726882620.01988: done checking for any_errors_fatal 
30529 1726882620.01991: checking for max_fail_percentage 30529 1726882620.02011: done checking for max_fail_percentage 30529 1726882620.02013: checking to see if all hosts have failed and the running result is not ok 30529 1726882620.02013: done checking to see if all hosts have failed 30529 1726882620.02014: getting the remaining hosts for this loop 30529 1726882620.02017: done getting the remaining hosts for this loop 30529 1726882620.02021: getting the next task for host managed_node1 30529 1726882620.02030: done getting next task for host managed_node1 30529 1726882620.02033: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882620.02037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882620.02056: getting variables 30529 1726882620.02057: in VariableManager get_vars() 30529 1726882620.02092: Calling all_inventory to load vars for managed_node1 30529 1726882620.02123: Calling groups_inventory to load vars for managed_node1 30529 1726882620.02126: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882620.02134: Calling all_plugins_play to load vars for managed_node1 30529 1726882620.02137: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882620.02139: Calling groups_plugins_play to load vars for managed_node1 30529 1726882620.02913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882620.03869: done with get_vars() 30529 1726882620.03901: done getting variables 30529 1726882620.03976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:00 -0400 (0:00:00.036) 0:00:34.066 ****** 30529 1726882620.04013: entering _queue_task() for managed_node1/debug 30529 1726882620.04332: worker is 1 (out of 1 available) 30529 1726882620.04348: exiting _queue_task() for managed_node1/debug 30529 1726882620.04365: done queuing things up, now waiting for results queue to drain 30529 1726882620.04367: waiting for pending results... 
30529 1726882620.04624: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882620.04720: in run() - task 12673a56-9f93-b0f1-edc0-000000000b45 30529 1726882620.04733: variable 'ansible_search_path' from source: unknown 30529 1726882620.04737: variable 'ansible_search_path' from source: unknown 30529 1726882620.04767: calling self._execute() 30529 1726882620.04841: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.04845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.04860: variable 'omit' from source: magic vars 30529 1726882620.05190: variable 'ansible_distribution_major_version' from source: facts 30529 1726882620.05205: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882620.05208: variable 'omit' from source: magic vars 30529 1726882620.05257: variable 'omit' from source: magic vars 30529 1726882620.05279: variable 'omit' from source: magic vars 30529 1726882620.05315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882620.05344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882620.05359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882620.05373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.05384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.05480: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882620.05484: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.05487: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882620.05528: Set connection var ansible_shell_executable to /bin/sh 30529 1726882620.05531: Set connection var ansible_pipelining to False 30529 1726882620.05534: Set connection var ansible_shell_type to sh 30529 1726882620.05572: Set connection var ansible_timeout to 10 30529 1726882620.05575: Set connection var ansible_connection to ssh 30529 1726882620.05577: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882620.05604: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.05608: variable 'ansible_connection' from source: unknown 30529 1726882620.05610: variable 'ansible_module_compression' from source: unknown 30529 1726882620.05612: variable 'ansible_shell_type' from source: unknown 30529 1726882620.05799: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.05802: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.05804: variable 'ansible_pipelining' from source: unknown 30529 1726882620.05806: variable 'ansible_timeout' from source: unknown 30529 1726882620.05808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.05811: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882620.05814: variable 'omit' from source: magic vars 30529 1726882620.05816: starting attempt loop 30529 1726882620.05817: running the handler 30529 1726882620.05989: variable '__network_connections_result' from source: set_fact 30529 1726882620.06056: handler run complete 30529 1726882620.06080: attempt loop complete, returning result 30529 1726882620.06087: _execute() done 30529 1726882620.06095: dumping result to json 30529 1726882620.06103: 
done dumping result, returning 30529 1726882620.06116: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000b45] 30529 1726882620.06125: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b45 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616" ] } 30529 1726882620.06374: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b45 30529 1726882620.06378: WORKER PROCESS EXITING 30529 1726882620.06390: no more pending results, returning what we have 30529 1726882620.06465: results queue empty 30529 1726882620.06467: checking for any_errors_fatal 30529 1726882620.06477: done checking for any_errors_fatal 30529 1726882620.06478: checking for max_fail_percentage 30529 1726882620.06480: done checking for max_fail_percentage 30529 1726882620.06481: checking to see if all hosts have failed and the running result is not ok 30529 1726882620.06482: done checking to see if all hosts have failed 30529 1726882620.06482: getting the remaining hosts for this loop 30529 1726882620.06484: done getting the remaining hosts for this loop 30529 1726882620.06488: getting the next task for host managed_node1 30529 1726882620.06499: done getting next task for host managed_node1 30529 1726882620.06503: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882620.06508: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882620.06522: getting variables 30529 1726882620.06524: in VariableManager get_vars() 30529 1726882620.06560: Calling all_inventory to load vars for managed_node1 30529 1726882620.06563: Calling groups_inventory to load vars for managed_node1 30529 1726882620.06565: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882620.06634: Calling all_plugins_play to load vars for managed_node1 30529 1726882620.06638: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882620.06641: Calling groups_plugins_play to load vars for managed_node1 30529 1726882620.07783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882620.08903: done with get_vars() 30529 1726882620.08917: done getting variables 30529 1726882620.08972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:00 -0400 (0:00:00.049) 0:00:34.116 ****** 30529 1726882620.09004: entering _queue_task() for managed_node1/debug 30529 1726882620.09236: worker is 1 (out of 1 available) 30529 1726882620.09251: exiting _queue_task() for managed_node1/debug 30529 1726882620.09266: done queuing things up, now waiting for results queue to drain 30529 1726882620.09268: waiting for pending results... 30529 1726882620.09608: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882620.09673: in run() - task 12673a56-9f93-b0f1-edc0-000000000b46 30529 1726882620.09688: variable 'ansible_search_path' from source: unknown 30529 1726882620.09697: variable 'ansible_search_path' from source: unknown 30529 1726882620.09719: calling self._execute() 30529 1726882620.09803: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.09810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.09846: variable 'omit' from source: magic vars 30529 1726882620.10186: variable 'ansible_distribution_major_version' from source: facts 30529 1726882620.10199: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882620.10205: variable 'omit' from source: magic vars 30529 1726882620.10266: variable 'omit' from source: magic vars 30529 1726882620.10287: variable 'omit' from source: magic vars 30529 1726882620.10328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882620.10366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882620.10380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882620.10395: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.10408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.10450: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882620.10457: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.10459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.10597: Set connection var ansible_shell_executable to /bin/sh 30529 1726882620.10601: Set connection var ansible_pipelining to False 30529 1726882620.10605: Set connection var ansible_shell_type to sh 30529 1726882620.10607: Set connection var ansible_timeout to 10 30529 1726882620.10609: Set connection var ansible_connection to ssh 30529 1726882620.10611: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882620.10656: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.10659: variable 'ansible_connection' from source: unknown 30529 1726882620.10662: variable 'ansible_module_compression' from source: unknown 30529 1726882620.10664: variable 'ansible_shell_type' from source: unknown 30529 1726882620.10666: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.10668: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.10670: variable 'ansible_pipelining' from source: unknown 30529 1726882620.10672: variable 'ansible_timeout' from source: unknown 30529 1726882620.10673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.10809: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882620.10842: variable 'omit' from source: magic vars 30529 1726882620.10845: starting attempt loop 30529 1726882620.10847: running the handler 30529 1726882620.10884: variable '__network_connections_result' from source: set_fact 30529 1726882620.10940: variable '__network_connections_result' from source: set_fact 30529 1726882620.11050: handler run complete 30529 1726882620.11083: attempt loop complete, returning result 30529 1726882620.11087: _execute() done 30529 1726882620.11096: dumping result to json 30529 1726882620.11100: done dumping result, returning 30529 1726882620.11103: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000b46] 30529 1726882620.11105: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b46 30529 1726882620.11206: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b46 30529 1726882620.11209: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616" ] } } 30529 1726882620.11309: no more pending results, returning what we have 30529 1726882620.11312: results queue 
empty 30529 1726882620.11313: checking for any_errors_fatal 30529 1726882620.11319: done checking for any_errors_fatal 30529 1726882620.11320: checking for max_fail_percentage 30529 1726882620.11321: done checking for max_fail_percentage 30529 1726882620.11322: checking to see if all hosts have failed and the running result is not ok 30529 1726882620.11323: done checking to see if all hosts have failed 30529 1726882620.11324: getting the remaining hosts for this loop 30529 1726882620.11325: done getting the remaining hosts for this loop 30529 1726882620.11328: getting the next task for host managed_node1 30529 1726882620.11335: done getting next task for host managed_node1 30529 1726882620.11338: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882620.11341: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882620.11353: getting variables 30529 1726882620.11355: in VariableManager get_vars() 30529 1726882620.11425: Calling all_inventory to load vars for managed_node1 30529 1726882620.11428: Calling groups_inventory to load vars for managed_node1 30529 1726882620.11430: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882620.11439: Calling all_plugins_play to load vars for managed_node1 30529 1726882620.11441: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882620.11443: Calling groups_plugins_play to load vars for managed_node1 30529 1726882620.12279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882620.13726: done with get_vars() 30529 1726882620.13749: done getting variables 30529 1726882620.13811: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:00 -0400 (0:00:00.048) 0:00:34.164 ****** 30529 1726882620.13844: entering _queue_task() for managed_node1/debug 30529 1726882620.14137: worker is 1 (out of 1 available) 30529 1726882620.14151: exiting _queue_task() for managed_node1/debug 30529 1726882620.14163: done queuing things up, now waiting for results queue to drain 30529 1726882620.14165: waiting for pending results... 
30529 1726882620.14379: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882620.14485: in run() - task 12673a56-9f93-b0f1-edc0-000000000b47 30529 1726882620.14522: variable 'ansible_search_path' from source: unknown 30529 1726882620.14529: variable 'ansible_search_path' from source: unknown 30529 1726882620.14566: calling self._execute() 30529 1726882620.14632: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.14638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.14648: variable 'omit' from source: magic vars 30529 1726882620.15024: variable 'ansible_distribution_major_version' from source: facts 30529 1726882620.15028: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882620.15336: variable 'network_state' from source: role '' defaults 30529 1726882620.15340: Evaluated conditional (network_state != {}): False 30529 1726882620.15343: when evaluation is False, skipping this task 30529 1726882620.15345: _execute() done 30529 1726882620.15348: dumping result to json 30529 1726882620.15350: done dumping result, returning 30529 1726882620.15353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000000b47] 30529 1726882620.15355: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b47 30529 1726882620.15646: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b47 30529 1726882620.15650: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882620.15695: no more pending results, returning what we have 30529 1726882620.15699: results queue empty 30529 1726882620.15700: checking for any_errors_fatal 30529 1726882620.15706: done checking for any_errors_fatal 30529 1726882620.15707: checking for 
max_fail_percentage 30529 1726882620.15708: done checking for max_fail_percentage 30529 1726882620.15709: checking to see if all hosts have failed and the running result is not ok 30529 1726882620.15710: done checking to see if all hosts have failed 30529 1726882620.15711: getting the remaining hosts for this loop 30529 1726882620.15712: done getting the remaining hosts for this loop 30529 1726882620.15716: getting the next task for host managed_node1 30529 1726882620.15723: done getting next task for host managed_node1 30529 1726882620.15726: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882620.15731: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882620.15749: getting variables 30529 1726882620.15751: in VariableManager get_vars() 30529 1726882620.15787: Calling all_inventory to load vars for managed_node1 30529 1726882620.15789: Calling groups_inventory to load vars for managed_node1 30529 1726882620.15792: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882620.15805: Calling all_plugins_play to load vars for managed_node1 30529 1726882620.15809: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882620.15812: Calling groups_plugins_play to load vars for managed_node1 30529 1726882620.18075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882620.20141: done with get_vars() 30529 1726882620.20166: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:00 -0400 (0:00:00.064) 0:00:34.229 ****** 30529 1726882620.20338: entering _queue_task() for managed_node1/ping 30529 1726882620.20991: worker is 1 (out of 1 available) 30529 1726882620.21004: exiting _queue_task() for managed_node1/ping 30529 1726882620.21015: done queuing things up, now waiting for results queue to drain 30529 1726882620.21017: waiting for pending results... 
30529 1726882620.21229: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882620.21440: in run() - task 12673a56-9f93-b0f1-edc0-000000000b48 30529 1726882620.21633: variable 'ansible_search_path' from source: unknown 30529 1726882620.21642: variable 'ansible_search_path' from source: unknown 30529 1726882620.21753: calling self._execute() 30529 1726882620.21777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.21781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.21796: variable 'omit' from source: magic vars 30529 1726882620.22799: variable 'ansible_distribution_major_version' from source: facts 30529 1726882620.22802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882620.22805: variable 'omit' from source: magic vars 30529 1726882620.22927: variable 'omit' from source: magic vars 30529 1726882620.22982: variable 'omit' from source: magic vars 30529 1726882620.23056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882620.23098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882620.23123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882620.23139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.23152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882620.23299: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882620.23302: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.23304: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882620.23659: Set connection var ansible_shell_executable to /bin/sh 30529 1726882620.23709: Set connection var ansible_pipelining to False 30529 1726882620.23712: Set connection var ansible_shell_type to sh 30529 1726882620.23714: Set connection var ansible_timeout to 10 30529 1726882620.23716: Set connection var ansible_connection to ssh 30529 1726882620.23719: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882620.23797: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.23801: variable 'ansible_connection' from source: unknown 30529 1726882620.23804: variable 'ansible_module_compression' from source: unknown 30529 1726882620.23806: variable 'ansible_shell_type' from source: unknown 30529 1726882620.23808: variable 'ansible_shell_executable' from source: unknown 30529 1726882620.23810: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882620.24003: variable 'ansible_pipelining' from source: unknown 30529 1726882620.24006: variable 'ansible_timeout' from source: unknown 30529 1726882620.24008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882620.24608: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882620.24619: variable 'omit' from source: magic vars 30529 1726882620.24624: starting attempt loop 30529 1726882620.24627: running the handler 30529 1726882620.24645: _low_level_execute_command(): starting 30529 1726882620.24770: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882620.26333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882620.26368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882620.26484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882620.26540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882620.26565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882620.28235: stdout chunk (state=3): >>>/root <<< 30529 1726882620.28355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882620.28391: stderr chunk (state=3): >>><<< 30529 1726882620.28408: stdout chunk (state=3): >>><<< 30529 1726882620.28437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882620.28457: _low_level_execute_command(): starting 30529 1726882620.28469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904 `" && echo ansible-tmp-1726882620.2844396-32169-203947867312904="` echo /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904 `" ) && sleep 0' 30529 1726882620.29057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882620.29072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882620.29091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882620.29112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882620.29131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882620.29144: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882620.29240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882620.29303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882620.29340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882620.31318: stdout chunk (state=3): >>>ansible-tmp-1726882620.2844396-32169-203947867312904=/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904 <<< 30529 1726882620.31380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882620.31417: stderr chunk (state=3): >>><<< 30529 1726882620.31420: stdout chunk (state=3): >>><<< 30529 1726882620.31439: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882620.2844396-32169-203947867312904=/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882620.31520: variable 'ansible_module_compression' from source: unknown 30529 1726882620.31580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882620.31625: variable 'ansible_facts' from source: unknown 30529 1726882620.31715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py 30529 1726882620.31923: Sending initial data 30529 1726882620.31926: Sent initial data (153 bytes) 30529 1726882620.32517: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882620.32530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882620.32634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882620.33019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882620.33065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882620.34729: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882620.34786: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882620.34841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqo1sj4pi /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py <<< 30529 1726882620.34846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py" <<< 30529 1726882620.34879: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqo1sj4pi" to remote "/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py" <<< 30529 1726882620.36563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882620.36567: stdout chunk (state=3): >>><<< 30529 1726882620.36569: stderr chunk (state=3): >>><<< 30529 1726882620.36572: done transferring module to remote 30529 1726882620.36574: _low_level_execute_command(): starting 30529 1726882620.36576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/ /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py && sleep 0' 30529 1726882620.38006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882620.38355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882620.38366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882620.38466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882620.38531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882620.41124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882620.41128: stdout chunk (state=3): >>><<< 30529 1726882620.41130: stderr chunk (state=3): >>><<< 30529 1726882620.41134: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882620.41141: _low_level_execute_command(): starting 30529 1726882620.41144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/AnsiballZ_ping.py && sleep 0' 30529 1726882620.42340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882620.42431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882620.42472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882620.42530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882620.42551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882620.42653: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30529 1726882620.57534: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882620.58670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882620.58674: stdout chunk (state=3): >>><<< 30529 1726882620.58677: stderr chunk (state=3): >>><<< 30529 1726882620.58705: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882620.58831: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882620.58840: _low_level_execute_command(): starting 30529 1726882620.58845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882620.2844396-32169-203947867312904/ > /dev/null 2>&1 && sleep 0' 30529 1726882620.60296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882620.60408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30529 1726882620.60613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
<<<
30529 1726882620.60748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
30529 1726882620.60821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
30529 1726882620.62853: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
30529 1726882620.62857: stdout chunk (state=3): >>><<<
30529 1726882620.62859: stderr chunk (state=3): >>><<<
30529 1726882620.62861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30529 1726882620.62864: handler run complete
30529 1726882620.62866: attempt loop complete, returning result
30529 1726882620.62868: _execute() done
30529 1726882620.62870: dumping result to json
30529 1726882620.62872: done dumping result, returning
30529 1726882620.62873: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000000b48]
30529 1726882620.62875: sending task result for task 12673a56-9f93-b0f1-edc0-000000000b48
30529 1726882620.63200: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000b48
ok: [managed_node1] => {
    "changed": false,
    "ping": "pong"
}
30529 1726882620.63276: no more pending results, returning what we have
30529 1726882620.63280: results queue empty
30529 1726882620.63281: checking for any_errors_fatal
30529 1726882620.63291: WORKER PROCESS EXITING
30529 1726882620.63503: done checking for any_errors_fatal
30529 1726882620.63508: checking for max_fail_percentage
30529 1726882620.63510: done checking for max_fail_percentage
30529 1726882620.63512: checking to see if all hosts have failed and the running result is not ok
30529 1726882620.63512: done checking to see if all hosts have failed
30529 1726882620.63513: getting the remaining hosts for this loop
30529 1726882620.63515: done getting the remaining hosts for this loop
30529 1726882620.63518: getting the next task for host managed_node1
30529 1726882620.63530: done getting next task for host managed_node1
30529 1726882620.63532: ^ task is: TASK: meta (role_complete)
30529 1726882620.63536: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882620.63549: getting variables
30529 1726882620.63552: in VariableManager get_vars()
30529 1726882620.63589: Calling all_inventory to load vars for managed_node1
30529 1726882620.63592: Calling groups_inventory to load vars for managed_node1
30529 1726882620.63596: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882620.63605: Calling all_plugins_play to load vars for managed_node1
30529 1726882620.63608: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882620.63610: Calling groups_plugins_play to load vars for managed_node1
30529 1726882620.75120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882620.77115: done with get_vars()
30529 1726882620.77146: done getting variables
30529 1726882620.77227: done queuing things up, now waiting for results queue to drain
30529 1726882620.77229: results queue empty
30529 1726882620.77230: checking for any_errors_fatal
30529 1726882620.77233: done checking for any_errors_fatal
30529 1726882620.77234: checking for max_fail_percentage
30529 1726882620.77235: done checking for max_fail_percentage
30529 1726882620.77235: checking to see if all hosts have failed and the running result is not ok
30529 1726882620.77236: done checking to see if all hosts have failed
30529 1726882620.77237: getting the remaining hosts for this loop
30529 1726882620.77238: done getting the remaining hosts for this loop
30529 1726882620.77240: getting the next task for host managed_node1
30529 1726882620.77245: done getting next task for host managed_node1
30529 1726882620.77247: ^ task is: TASK: Show result
30529 1726882620.77249: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882620.77251: getting variables
30529 1726882620.77252: in VariableManager get_vars()
30529 1726882620.77264: Calling all_inventory to load vars for managed_node1
30529 1726882620.77266: Calling groups_inventory to load vars for managed_node1
30529 1726882620.77268: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882620.77273: Calling all_plugins_play to load vars for managed_node1
30529 1726882620.77275: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882620.77278: Calling groups_plugins_play to load vars for managed_node1
30529 1726882620.78767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882620.81192: done with get_vars()
30529 1726882620.81222: done getting variables
30529 1726882620.81265: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show result] *************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14
Friday 20 September 2024 21:37:00 -0400 (0:00:00.609) 0:00:34.839 ******
30529 1726882620.81296: entering _queue_task() for managed_node1/debug
30529 1726882620.81728: worker is 1 (out of 1 available)
30529 1726882620.81744: exiting _queue_task() for managed_node1/debug
30529 1726882620.81756: done queuing things up, now waiting for results queue to drain
30529 1726882620.81758: waiting for pending results...
30529 1726882620.82011: running TaskExecutor() for managed_node1/TASK: Show result
30529 1726882620.82125: in run() - task 12673a56-9f93-b0f1-edc0-000000000ad2
30529 1726882620.82143: variable 'ansible_search_path' from source: unknown
30529 1726882620.82146: variable 'ansible_search_path' from source: unknown
30529 1726882620.82186: calling self._execute()
30529 1726882620.82306: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882620.82313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882620.82324: variable 'omit' from source: magic vars
30529 1726882620.82787: variable 'ansible_distribution_major_version' from source: facts
30529 1726882620.82809: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882620.82916: variable 'omit' from source: magic vars
30529 1726882620.82919: variable 'omit' from source: magic vars
30529 1726882620.82927: variable 'omit' from source: magic vars
30529 1726882620.82973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30529 1726882620.83025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30529 1726882620.83050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30529 1726882620.83094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882620.83126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30529 1726882620.83307: variable 'inventory_hostname' from source: host vars for 'managed_node1'
30529 1726882620.83311: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882620.83315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882620.83338: Set connection var ansible_shell_executable to /bin/sh
30529 1726882620.83342: Set connection var ansible_pipelining to False
30529 1726882620.83344: Set connection var ansible_shell_type to sh
30529 1726882620.83347: Set connection var ansible_timeout to 10
30529 1726882620.83349: Set connection var ansible_connection to ssh
30529 1726882620.83351: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882620.83372: variable 'ansible_shell_executable' from source: unknown
30529 1726882620.83375: variable 'ansible_connection' from source: unknown
30529 1726882620.83378: variable 'ansible_module_compression' from source: unknown
30529 1726882620.83380: variable 'ansible_shell_type' from source: unknown
30529 1726882620.83384: variable 'ansible_shell_executable' from source: unknown
30529 1726882620.83398: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882620.83422: variable 'ansible_pipelining' from source: unknown
30529 1726882620.83425: variable 'ansible_timeout' from source: unknown
30529 1726882620.83427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882620.83570: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882620.83580: variable 'omit' from source: magic vars
30529 1726882620.83585: starting attempt loop
30529 1726882620.83588: running the handler
30529 1726882620.83646: variable '__network_connections_result' from source: set_fact
30529 1726882620.83724: variable '__network_connections_result' from source: set_fact
30529 1726882620.83926: handler run complete
30529 1726882620.83929: attempt loop complete, returning result
30529 1726882620.83937: _execute() done
30529 1726882620.83940: dumping result to json
30529 1726882620.83979: done dumping result, returning
30529 1726882620.83983: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-000000000ad2]
30529 1726882620.83985: sending task result for task 12673a56-9f93-b0f1-edc0-000000000ad2
30529 1726882620.84150: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000ad2
30529 1726882620.84156: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616"
        ]
    }
}
30529 1726882620.84242: no more pending results, returning what we have
30529 1726882620.84246: results queue empty
30529 1726882620.84247: checking for any_errors_fatal
30529 1726882620.84249: done checking for any_errors_fatal
30529 1726882620.84250: checking for max_fail_percentage
30529 1726882620.84252: done checking for max_fail_percentage
30529 1726882620.84253: checking to see if all hosts have failed and the running result is not ok
30529 1726882620.84254: done checking to see if all hosts have failed
30529 1726882620.84255: getting the remaining hosts for this loop
30529 1726882620.84257: done getting the remaining hosts for this loop
30529 1726882620.84261: getting the next task for host managed_node1
30529 1726882620.84271: done getting next task for host managed_node1
30529 1726882620.84275: ^ task is: TASK: Test
30529 1726882620.84278: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882620.84284: getting variables
30529 1726882620.84287: in VariableManager get_vars()
30529 1726882620.84323: Calling all_inventory to load vars for managed_node1
30529 1726882620.84326: Calling groups_inventory to load vars for managed_node1
30529 1726882620.84330: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882620.84345: Calling all_plugins_play to load vars for managed_node1
30529 1726882620.84348: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882620.84352: Calling groups_plugins_play to load vars for managed_node1
30529 1726882620.86526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882620.88281: done with get_vars()
30529 1726882620.88304: done getting variables

TASK [Test] ********************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30
Friday 20 September 2024 21:37:00 -0400 (0:00:00.071) 0:00:34.910 ******
30529 1726882620.88418: entering _queue_task() for managed_node1/include_tasks
30529 1726882620.88915: worker is 1 (out of 1 available)
30529 1726882620.88928: exiting _queue_task() for managed_node1/include_tasks
30529 1726882620.88942: done queuing things up, now waiting for results queue to drain
30529 1726882620.88944: waiting for pending results...
30529 1726882620.89241: running TaskExecutor() for managed_node1/TASK: Test
30529 1726882620.89338: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4d
30529 1726882620.89342: variable 'ansible_search_path' from source: unknown
30529 1726882620.89344: variable 'ansible_search_path' from source: unknown
30529 1726882620.89347: variable 'lsr_test' from source: include params
30529 1726882620.89566: variable 'lsr_test' from source: include params
30529 1726882620.89632: variable 'omit' from source: magic vars
30529 1726882620.89760: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882620.89769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882620.89778: variable 'omit' from source: magic vars
30529 1726882620.90012: variable 'ansible_distribution_major_version' from source: facts
30529 1726882620.90022: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882620.90029: variable 'item' from source: unknown
30529 1726882620.90098: variable 'item' from source: unknown
30529 1726882620.90127: variable 'item' from source: unknown
30529 1726882620.90182: variable 'item' from source: unknown
30529 1726882620.90321: dumping result to json
30529 1726882620.90325: done dumping result, returning
30529 1726882620.90328: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-000000000a4d]
30529 1726882620.90330: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4d
30529 1726882620.90366: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4d
30529 1726882620.90370: WORKER PROCESS EXITING
30529 1726882620.90395: no more pending results, returning what we have
30529 1726882620.90400: in VariableManager get_vars()
30529 1726882620.90440: Calling all_inventory to load vars for managed_node1
30529 1726882620.90443: Calling groups_inventory to load vars for managed_node1
30529 1726882620.90446: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882620.90461: Calling all_plugins_play to load vars for managed_node1
30529 1726882620.90465: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882620.90467: Calling groups_plugins_play to load vars for managed_node1
30529 1726882620.91972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882620.94359: done with get_vars()
30529 1726882620.94412: variable 'ansible_search_path' from source: unknown
30529 1726882620.94414: variable 'ansible_search_path' from source: unknown
30529 1726882620.94522: we have included files to process
30529 1726882620.94523: generating all_blocks data
30529 1726882620.94525: done generating all_blocks data
30529 1726882620.94530: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30529 1726882620.94531: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30529 1726882620.94533: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml
30529 1726882620.94727: done processing included file
30529 1726882620.94729: iterating over new_blocks loaded from include file
30529 1726882620.94731: in VariableManager get_vars()
30529 1726882620.94747: done with get_vars()
30529 1726882620.94749: filtering new block on tags
30529 1726882620.94776: done filtering new block on tags
30529 1726882620.94779: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node1 => (item=tasks/activate_profile.yml)
30529 1726882620.94783: extending task lists for all hosts with included blocks
30529 1726882620.96103: done extending task lists
30529 1726882620.96104: done processing included files
30529 1726882620.96105: results queue empty
30529 1726882620.96106: checking for any_errors_fatal
30529 1726882620.96111: done checking for any_errors_fatal
30529 1726882620.96111: checking for max_fail_percentage
30529 1726882620.96113: done checking for max_fail_percentage
30529 1726882620.96113: checking to see if all hosts have failed and the running result is not ok
30529 1726882620.96114: done checking to see if all hosts have failed
30529 1726882620.96115: getting the remaining hosts for this loop
30529 1726882620.96116: done getting the remaining hosts for this loop
30529 1726882620.96119: getting the next task for host managed_node1
30529 1726882620.96128: done getting next task for host managed_node1
30529 1726882620.96131: ^ task is: TASK: Include network role
30529 1726882620.96134: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882620.96137: getting variables
30529 1726882620.96138: in VariableManager get_vars()
30529 1726882620.96148: Calling all_inventory to load vars for managed_node1
30529 1726882620.96150: Calling groups_inventory to load vars for managed_node1
30529 1726882620.96153: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882620.96158: Calling all_plugins_play to load vars for managed_node1
30529 1726882620.96161: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882620.96163: Calling groups_plugins_play to load vars for managed_node1
30529 1726882620.97506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882620.99320: done with get_vars()
30529 1726882620.99346: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3
Friday 20 September 2024 21:37:00 -0400 (0:00:00.110) 0:00:35.020 ******
30529 1726882620.99447: entering _queue_task() for managed_node1/include_role
30529 1726882621.00096: worker is 1 (out of 1 available)
30529 1726882621.00105: exiting _queue_task() for managed_node1/include_role
30529 1726882621.00114: done queuing things up, now waiting for results queue to drain
30529 1726882621.00116: waiting for pending results...
30529 1726882621.00427: running TaskExecutor() for managed_node1/TASK: Include network role
30529 1726882621.00433: in run() - task 12673a56-9f93-b0f1-edc0-000000000caa
30529 1726882621.00435: variable 'ansible_search_path' from source: unknown
30529 1726882621.00441: variable 'ansible_search_path' from source: unknown
30529 1726882621.00446: calling self._execute()
30529 1726882621.00628: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882621.00632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882621.00643: variable 'omit' from source: magic vars
30529 1726882621.01130: variable 'ansible_distribution_major_version' from source: facts
30529 1726882621.01143: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882621.01149: _execute() done
30529 1726882621.01152: dumping result to json
30529 1726882621.01157: done dumping result, returning
30529 1726882621.01164: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000000caa]
30529 1726882621.01171: sending task result for task 12673a56-9f93-b0f1-edc0-000000000caa
30529 1726882621.01396: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000caa
30529 1726882621.01400: WORKER PROCESS EXITING
30529 1726882621.01461: no more pending results, returning what we have
30529 1726882621.01466: in VariableManager get_vars()
30529 1726882621.01504: Calling all_inventory to load vars for managed_node1
30529 1726882621.01507: Calling groups_inventory to load vars for managed_node1
30529 1726882621.01511: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882621.01524: Calling all_plugins_play to load vars for managed_node1
30529 1726882621.01527: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882621.01531: Calling groups_plugins_play to load vars for managed_node1
30529 1726882621.03685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882621.06143: done with get_vars()
30529 1726882621.06170: variable 'ansible_search_path' from source: unknown
30529 1726882621.06172: variable 'ansible_search_path' from source: unknown
30529 1726882621.06411: variable 'omit' from source: magic vars
30529 1726882621.06521: variable 'omit' from source: magic vars
30529 1726882621.06538: variable 'omit' from source: magic vars
30529 1726882621.06542: we have included files to process
30529 1726882621.06542: generating all_blocks data
30529 1726882621.06544: done generating all_blocks data
30529 1726882621.06545: processing included file: fedora.linux_system_roles.network
30529 1726882621.06569: in VariableManager get_vars()
30529 1726882621.06584: done with get_vars()
30529 1726882621.06618: in VariableManager get_vars()
30529 1726882621.06637: done with get_vars()
30529 1726882621.06678: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30529 1726882621.06811: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30529 1726882621.06945: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30529 1726882621.08085: in VariableManager get_vars()
30529 1726882621.08114: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30529 1726882621.10535: iterating over new_blocks loaded from include file
30529 1726882621.10537: in VariableManager get_vars()
30529 1726882621.10556: done with get_vars()
30529 1726882621.10559: filtering new block on tags
30529 1726882621.10880: done filtering new block on tags
30529 1726882621.10884: in VariableManager get_vars()
30529 1726882621.10905: done with get_vars()
30529 1726882621.10907: filtering new block on tags
30529 1726882621.10929: done filtering new block on tags
30529 1726882621.10931: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node1
30529 1726882621.10937: extending task lists for all hosts with included blocks
30529 1726882621.11058: done extending task lists
30529 1726882621.11059: done processing included files
30529 1726882621.11060: results queue empty
30529 1726882621.11061: checking for any_errors_fatal
30529 1726882621.11064: done checking for any_errors_fatal
30529 1726882621.11065: checking for max_fail_percentage
30529 1726882621.11066: done checking for max_fail_percentage
30529 1726882621.11067: checking to see if all hosts have failed and the running result is not ok
30529 1726882621.11068: done checking to see if all hosts have failed
30529 1726882621.11068: getting the remaining hosts for this loop
30529 1726882621.11070: done getting the remaining hosts for this loop
30529 1726882621.11072: getting the next task for host managed_node1
30529 1726882621.11077: done getting next task for host managed_node1
30529 1726882621.11079: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30529 1726882621.11082: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882621.11097: getting variables
30529 1726882621.11098: in VariableManager get_vars()
30529 1726882621.11110: Calling all_inventory to load vars for managed_node1
30529 1726882621.11113: Calling groups_inventory to load vars for managed_node1
30529 1726882621.11115: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882621.11121: Calling all_plugins_play to load vars for managed_node1
30529 1726882621.11123: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882621.11126: Calling groups_plugins_play to load vars for managed_node1
30529 1726882621.13695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882621.15933: done with get_vars()
30529 1726882621.15958: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:37:01 -0400 (0:00:00.165) 0:00:35.186 ******
30529 1726882621.16047: entering _queue_task() for managed_node1/include_tasks
30529 1726882621.16440: worker is 1 (out of 1 available)
30529 1726882621.16453: exiting _queue_task() for managed_node1/include_tasks
30529 1726882621.16468: done queuing things up, now waiting for results queue to drain
30529 1726882621.16470: waiting for pending results...
30529 1726882621.16769: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30529 1726882621.16970: in run() - task 12673a56-9f93-b0f1-edc0-000000000d16
30529 1726882621.16974: variable 'ansible_search_path' from source: unknown
30529 1726882621.16976: variable 'ansible_search_path' from source: unknown
30529 1726882621.16979: calling self._execute()
30529 1726882621.17056: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882621.17061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882621.17071: variable 'omit' from source: magic vars
30529 1726882621.17481: variable 'ansible_distribution_major_version' from source: facts
30529 1726882621.17485: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882621.17488: _execute() done
30529 1726882621.17491: dumping result to json
30529 1726882621.17577: done dumping result, returning
30529 1726882621.17580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000000d16]
30529 1726882621.17582: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d16
30529 1726882621.17646: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d16
30529 1726882621.17649: WORKER PROCESS EXITING
30529 1726882621.17727: no more pending results, returning what we have
30529 1726882621.17734: in VariableManager get_vars()
30529 1726882621.17778: Calling all_inventory to load vars for managed_node1
30529 1726882621.17782: Calling groups_inventory to load vars for managed_node1
30529 1726882621.17784: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882621.17802: Calling all_plugins_play to load vars for managed_node1
30529 1726882621.17806: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882621.17809: Calling groups_plugins_play to load vars for managed_node1
30529 1726882621.19356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882621.22021: done with get_vars()
30529 1726882621.22052: variable 'ansible_search_path' from source: unknown
30529 1726882621.22054: variable 'ansible_search_path' from source: unknown
30529 1726882621.22100: we have included files to process
30529 1726882621.22102: generating all_blocks data
30529 1726882621.22104: done generating all_blocks data
30529 1726882621.22108: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882621.22109: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882621.22112: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30529 1726882621.22738: done processing included file
30529 1726882621.22740: iterating over new_blocks loaded from include file
30529 1726882621.22741: in VariableManager get_vars()
30529 1726882621.22767: done with get_vars()
30529 1726882621.22769: filtering new block on tags
30529 1726882621.22804: done filtering new block on tags
30529 1726882621.22808: in VariableManager get_vars()
30529 1726882621.22832: done with get_vars()
30529 1726882621.22834: filtering new block on tags
30529 1726882621.22879: done filtering new block on tags
30529 1726882621.22882: in VariableManager get_vars()
30529 1726882621.22912: done with get_vars()
30529 1726882621.22914: filtering new block on tags
30529 1726882621.22956: done filtering new block on tags
30529 1726882621.22959: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
30529 1726882621.22964: extending task lists for all hosts with included blocks
30529 1726882621.25187: done extending task lists
30529 1726882621.25189: done processing included files
30529 1726882621.25190: results queue empty
30529 1726882621.25191: checking for any_errors_fatal
30529 1726882621.25196: done checking for any_errors_fatal
30529 1726882621.25197: checking for max_fail_percentage
30529 1726882621.25198: done checking for max_fail_percentage
30529 1726882621.25198: checking to see if all hosts have failed and the running result is not ok
30529 1726882621.25199: done checking to see if all hosts have failed
30529 1726882621.25200: getting the remaining hosts for this loop
30529 1726882621.25201: done getting the remaining hosts for this loop
30529 1726882621.25204: getting the next task for host managed_node1
30529 1726882621.25209: done getting next task for host managed_node1
30529 1726882621.25211: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882621.25215: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882621.25225: getting variables
30529 1726882621.25226: in VariableManager get_vars()
30529 1726882621.25243: Calling all_inventory to load vars for managed_node1
30529 1726882621.25245: Calling groups_inventory to load vars for managed_node1
30529 1726882621.25247: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882621.25251: Calling all_plugins_play to load vars for managed_node1
30529 1726882621.25253: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882621.25256: Calling groups_plugins_play to load vars for managed_node1
30529 1726882621.27785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882621.29646: done with get_vars()
30529 1726882621.29672: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:37:01 -0400 (0:00:00.137) 0:00:35.323 ******
30529 1726882621.29756: entering _queue_task() for managed_node1/setup
30529 1726882621.30316: worker is 1 (out of 1 available)
30529 1726882621.30326: exiting _queue_task() for managed_node1/setup
30529 1726882621.30336: done queuing things up, now waiting for results queue to drain
30529 1726882621.30338: waiting for pending results...
30529 1726882621.30530: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30529 1726882621.30894: in run() - task 12673a56-9f93-b0f1-edc0-000000000d6d
30529 1726882621.30899: variable 'ansible_search_path' from source: unknown
30529 1726882621.30902: variable 'ansible_search_path' from source: unknown
30529 1726882621.30921: calling self._execute()
30529 1726882621.31111: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882621.31332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882621.31336: variable 'omit' from source: magic vars
30529 1726882621.31934: variable 'ansible_distribution_major_version' from source: facts
30529 1726882621.31959: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882621.32184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30529 1726882621.34490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30529 1726882621.34562: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30529 1726882621.34701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30529 1726882621.34705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30529 1726882621.34708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30529 1726882621.34770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882621.34812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption'
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882621.34847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882621.34884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882621.34905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882621.35025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882621.35028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882621.35031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882621.35067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882621.35083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882621.35502: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882621.35505: variable 'ansible_facts' from source: unknown 30529 1726882621.37101: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882621.37112: when evaluation is False, skipping this task 30529 1726882621.37120: _execute() done 30529 1726882621.37126: dumping result to json 30529 1726882621.37133: done dumping result, returning 30529 1726882621.37149: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000000d6d] 30529 1726882621.37184: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d6d 30529 1726882621.37448: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d6d 30529 1726882621.37451: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882621.37501: no more pending results, returning what we have 30529 1726882621.37506: results queue empty 30529 1726882621.37507: checking for any_errors_fatal 30529 1726882621.37509: done checking for any_errors_fatal 30529 1726882621.37510: checking for max_fail_percentage 30529 1726882621.37512: done checking for max_fail_percentage 30529 1726882621.37513: checking to see if all hosts have failed and the running result is not ok 30529 1726882621.37514: done checking to see if all hosts have failed 30529 1726882621.37515: getting the remaining hosts for this loop 30529 1726882621.37516: done getting the remaining hosts for this loop 30529 1726882621.37521: getting the next task for host managed_node1 30529 1726882621.37535: done getting next task for host managed_node1 30529 1726882621.37540: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882621.37546: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882621.37569: getting variables 30529 1726882621.37571: in VariableManager get_vars() 30529 1726882621.37717: Calling all_inventory to load vars for managed_node1 30529 1726882621.37720: Calling groups_inventory to load vars for managed_node1 30529 1726882621.37722: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882621.37734: Calling all_plugins_play to load vars for managed_node1 30529 1726882621.37737: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882621.37747: Calling groups_plugins_play to load vars for managed_node1 30529 1726882621.39823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882621.42940: done with get_vars() 30529 1726882621.42975: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:01 -0400 (0:00:00.133) 0:00:35.456 ****** 30529 1726882621.43082: entering _queue_task() for managed_node1/stat 30529 1726882621.43534: worker is 1 (out of 1 available) 30529 1726882621.43545: exiting _queue_task() for managed_node1/stat 30529 1726882621.43556: done queuing things up, now waiting for results queue to drain 30529 1726882621.43558: waiting for pending results... 
30529 1726882621.43847: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882621.43930: in run() - task 12673a56-9f93-b0f1-edc0-000000000d6f 30529 1726882621.43958: variable 'ansible_search_path' from source: unknown 30529 1726882621.43968: variable 'ansible_search_path' from source: unknown 30529 1726882621.44011: calling self._execute() 30529 1726882621.44163: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882621.44166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882621.44171: variable 'omit' from source: magic vars 30529 1726882621.44545: variable 'ansible_distribution_major_version' from source: facts 30529 1726882621.44583: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882621.44764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882621.45049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882621.45145: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882621.45148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882621.45179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882621.45274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882621.45306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882621.45339: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882621.45379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882621.45473: variable '__network_is_ostree' from source: set_fact 30529 1726882621.45583: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882621.45587: when evaluation is False, skipping this task 30529 1726882621.45590: _execute() done 30529 1726882621.45592: dumping result to json 30529 1726882621.45597: done dumping result, returning 30529 1726882621.45600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000000d6f] 30529 1726882621.45602: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d6f 30529 1726882621.45671: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d6f 30529 1726882621.45675: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882621.45739: no more pending results, returning what we have 30529 1726882621.45744: results queue empty 30529 1726882621.45745: checking for any_errors_fatal 30529 1726882621.45754: done checking for any_errors_fatal 30529 1726882621.45755: checking for max_fail_percentage 30529 1726882621.45757: done checking for max_fail_percentage 30529 1726882621.45758: checking to see if all hosts have failed and the running result is not ok 30529 1726882621.45760: done checking to see if all hosts have failed 30529 1726882621.45760: getting the remaining hosts for this loop 30529 1726882621.45763: done getting the remaining hosts for this loop 30529 
1726882621.45767: getting the next task for host managed_node1 30529 1726882621.45778: done getting next task for host managed_node1 30529 1726882621.45781: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882621.45787: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882621.45812: getting variables 30529 1726882621.45814: in VariableManager get_vars() 30529 1726882621.45852: Calling all_inventory to load vars for managed_node1 30529 1726882621.45855: Calling groups_inventory to load vars for managed_node1 30529 1726882621.45858: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882621.45869: Calling all_plugins_play to load vars for managed_node1 30529 1726882621.45873: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882621.45876: Calling groups_plugins_play to load vars for managed_node1 30529 1726882621.47612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882621.49953: done with get_vars() 30529 1726882621.49979: done getting variables 30529 1726882621.50169: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:01 -0400 (0:00:00.071) 0:00:35.528 ****** 30529 1726882621.50209: entering _queue_task() for managed_node1/set_fact 30529 1726882621.51062: worker is 1 (out of 1 available) 30529 1726882621.51076: exiting _queue_task() for managed_node1/set_fact 30529 1726882621.51089: done queuing things up, now waiting for results queue to drain 30529 1726882621.51091: waiting for pending results... 
30529 1726882621.51614: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882621.51949: in run() - task 12673a56-9f93-b0f1-edc0-000000000d70 30529 1726882621.51984: variable 'ansible_search_path' from source: unknown 30529 1726882621.51988: variable 'ansible_search_path' from source: unknown 30529 1726882621.52004: calling self._execute() 30529 1726882621.52312: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882621.52315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882621.52318: variable 'omit' from source: magic vars 30529 1726882621.53084: variable 'ansible_distribution_major_version' from source: facts 30529 1726882621.53252: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882621.53621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882621.54275: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882621.54362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882621.54394: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882621.54436: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882621.54672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882621.54705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882621.54779: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882621.54814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882621.55090: variable '__network_is_ostree' from source: set_fact 30529 1726882621.55473: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882621.55477: when evaluation is False, skipping this task 30529 1726882621.55480: _execute() done 30529 1726882621.55483: dumping result to json 30529 1726882621.55485: done dumping result, returning 30529 1726882621.55488: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000000d70] 30529 1726882621.55490: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d70 30529 1726882621.55560: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d70 30529 1726882621.55563: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882621.55628: no more pending results, returning what we have 30529 1726882621.55632: results queue empty 30529 1726882621.55633: checking for any_errors_fatal 30529 1726882621.55642: done checking for any_errors_fatal 30529 1726882621.55642: checking for max_fail_percentage 30529 1726882621.55644: done checking for max_fail_percentage 30529 1726882621.55645: checking to see if all hosts have failed and the running result is not ok 30529 1726882621.55646: done checking to see if all hosts have failed 30529 1726882621.55647: getting the remaining hosts for this loop 30529 1726882621.55648: done getting the remaining hosts for this loop 
30529 1726882621.55652: getting the next task for host managed_node1 30529 1726882621.55666: done getting next task for host managed_node1 30529 1726882621.55669: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882621.55676: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882621.55703: getting variables 30529 1726882621.55705: in VariableManager get_vars() 30529 1726882621.55743: Calling all_inventory to load vars for managed_node1 30529 1726882621.55745: Calling groups_inventory to load vars for managed_node1 30529 1726882621.55747: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882621.55757: Calling all_plugins_play to load vars for managed_node1 30529 1726882621.55759: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882621.55762: Calling groups_plugins_play to load vars for managed_node1 30529 1726882621.58538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882621.61992: done with get_vars() 30529 1726882621.62132: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:01 -0400 (0:00:00.121) 0:00:35.649 ****** 30529 1726882621.62337: entering _queue_task() for managed_node1/service_facts 30529 1726882621.62911: worker is 1 (out of 1 available) 30529 1726882621.62924: exiting _queue_task() for managed_node1/service_facts 30529 1726882621.62937: done queuing things up, now waiting for results queue to drain 30529 1726882621.62938: waiting for pending results... 
30529 1726882621.63425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882621.63433: in run() - task 12673a56-9f93-b0f1-edc0-000000000d72 30529 1726882621.63439: variable 'ansible_search_path' from source: unknown 30529 1726882621.63443: variable 'ansible_search_path' from source: unknown 30529 1726882621.63466: calling self._execute() 30529 1726882621.63575: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882621.63579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882621.63590: variable 'omit' from source: magic vars 30529 1726882621.64015: variable 'ansible_distribution_major_version' from source: facts 30529 1726882621.64029: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882621.64036: variable 'omit' from source: magic vars 30529 1726882621.64129: variable 'omit' from source: magic vars 30529 1726882621.64166: variable 'omit' from source: magic vars 30529 1726882621.64210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882621.64249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882621.64270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882621.64287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882621.64405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882621.64409: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882621.64412: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882621.64415: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882621.64463: Set connection var ansible_shell_executable to /bin/sh 30529 1726882621.64467: Set connection var ansible_pipelining to False 30529 1726882621.64470: Set connection var ansible_shell_type to sh 30529 1726882621.64480: Set connection var ansible_timeout to 10 30529 1726882621.64483: Set connection var ansible_connection to ssh 30529 1726882621.64488: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882621.64520: variable 'ansible_shell_executable' from source: unknown 30529 1726882621.64524: variable 'ansible_connection' from source: unknown 30529 1726882621.64531: variable 'ansible_module_compression' from source: unknown 30529 1726882621.64533: variable 'ansible_shell_type' from source: unknown 30529 1726882621.64536: variable 'ansible_shell_executable' from source: unknown 30529 1726882621.64539: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882621.64543: variable 'ansible_pipelining' from source: unknown 30529 1726882621.64545: variable 'ansible_timeout' from source: unknown 30529 1726882621.64549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882621.64770: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882621.64781: variable 'omit' from source: magic vars 30529 1726882621.64786: starting attempt loop 30529 1726882621.64790: running the handler 30529 1726882621.64835: _low_level_execute_command(): starting 30529 1726882621.64838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882621.65991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882621.66009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.66145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882621.66148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882621.66168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882621.67880: stdout chunk (state=3): >>>/root <<< 30529 1726882621.67980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882621.68023: stderr chunk (state=3): >>><<< 30529 1726882621.68027: stdout chunk (state=3): >>><<< 30529 1726882621.68052: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882621.68066: _low_level_execute_command(): starting 30529 1726882621.68201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523 `" && echo ansible-tmp-1726882621.6805222-32222-206887054505523="` echo /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523 `" ) && sleep 0' 30529 1726882621.68663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882621.68672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882621.68684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882621.68704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.68718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882621.68725: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882621.68737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.68861: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882621.68866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882621.68869: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882621.68871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882621.68873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882621.68876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.68878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882621.68880: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882621.68882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.68883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882621.68905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882621.68916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882621.68987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882621.71015: stdout chunk (state=3): >>>ansible-tmp-1726882621.6805222-32222-206887054505523=/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523 <<< 30529 1726882621.71018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882621.71198: stderr chunk (state=3): >>><<< 30529 1726882621.71201: stdout chunk (state=3): >>><<< 30529 1726882621.71205: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882621.6805222-32222-206887054505523=/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882621.71209: variable 'ansible_module_compression' from source: unknown 30529 1726882621.71212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882621.71214: variable 'ansible_facts' from source: unknown 30529 1726882621.71459: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py 30529 1726882621.71820: Sending initial data 30529 1726882621.71823: Sent initial data (162 bytes) 30529 1726882621.73211: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882621.73230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882621.73302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882621.73373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882621.74941: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882621.75002: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30529 1726882621.75049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmprn_vcp_s /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py <<< 30529 1726882621.75053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py" <<< 30529 1726882621.75155: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmprn_vcp_s" to remote "/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py" <<< 30529 1726882621.76140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882621.76206: stderr chunk (state=3): >>><<< 30529 1726882621.76213: stdout chunk (state=3): >>><<< 30529 1726882621.76321: done transferring module to remote 30529 1726882621.76332: _low_level_execute_command(): starting 30529 1726882621.76337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/ /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py && sleep 0' 30529 1726882621.76988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882621.76999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882621.77010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882621.77023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.77035: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882621.77043: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882621.77051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.77086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.77198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.77201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882621.77204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882621.77273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882621.79033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882621.79038: stdout chunk (state=3): >>><<< 30529 1726882621.79040: stderr chunk (state=3): >>><<< 30529 1726882621.79157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882621.79162: _low_level_execute_command(): starting 30529 1726882621.79164: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/AnsiballZ_service_facts.py && sleep 0' 30529 1726882621.79654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882621.79664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882621.79675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882621.79692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.79704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882621.79713: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882621.79727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.79735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882621.79827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 
1726882621.79832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882621.79834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882621.79836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882621.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882621.79839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882621.79841: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882621.79843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882621.79869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882621.79881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882621.79894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882621.79973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.33055: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": 
{"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": 
"systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": 
{"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": 
{"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882623.34232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882623.34492: stdout chunk (state=3): >>><<< 30529 1726882623.34502: stderr chunk (state=3): >>><<< 30529 1726882623.34508: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882623.36633: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882623.36647: _low_level_execute_command(): starting 30529 1726882623.36654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882621.6805222-32222-206887054505523/ > /dev/null 2>&1 && sleep 0' 30529 1726882623.37302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882623.37319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882623.37334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882623.37361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882623.37467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882623.37489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882623.37569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.39432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882623.39435: stdout chunk (state=3): >>><<< 30529 1726882623.39441: stderr chunk (state=3): >>><<< 30529 1726882623.39453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882623.39459: handler 
run complete 30529 1726882623.39570: variable 'ansible_facts' from source: unknown 30529 1726882623.39663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882623.39938: variable 'ansible_facts' from source: unknown 30529 1726882623.40019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882623.40132: attempt loop complete, returning result 30529 1726882623.40135: _execute() done 30529 1726882623.40138: dumping result to json 30529 1726882623.40171: done dumping result, returning 30529 1726882623.40182: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000000d72] 30529 1726882623.40184: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d72 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882623.40842: no more pending results, returning what we have 30529 1726882623.40845: results queue empty 30529 1726882623.40846: checking for any_errors_fatal 30529 1726882623.40850: done checking for any_errors_fatal 30529 1726882623.40851: checking for max_fail_percentage 30529 1726882623.40853: done checking for max_fail_percentage 30529 1726882623.40853: checking to see if all hosts have failed and the running result is not ok 30529 1726882623.40854: done checking to see if all hosts have failed 30529 1726882623.40855: getting the remaining hosts for this loop 30529 1726882623.40856: done getting the remaining hosts for this loop 30529 1726882623.40859: getting the next task for host managed_node1 30529 1726882623.40866: done getting next task for host managed_node1 30529 1726882623.40868: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882623.40874: ^ state is: HOST STATE: 
block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882623.40884: getting variables 30529 1726882623.40885: in VariableManager get_vars() 30529 1726882623.40959: Calling all_inventory to load vars for managed_node1 30529 1726882623.40963: Calling groups_inventory to load vars for managed_node1 30529 1726882623.40965: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882623.40971: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d72 30529 1726882623.40974: WORKER PROCESS EXITING 30529 1726882623.40983: Calling all_plugins_play to load vars for managed_node1 30529 1726882623.40986: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882623.40994: Calling groups_plugins_play to load vars for managed_node1 30529 1726882623.42246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882623.43124: done with get_vars() 30529 1726882623.43140: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:03 -0400 (0:00:01.808) 0:00:37.458 ****** 30529 1726882623.43217: entering _queue_task() for managed_node1/package_facts 30529 1726882623.43438: worker is 1 (out of 1 available) 30529 1726882623.43452: exiting _queue_task() for managed_node1/package_facts 30529 1726882623.43464: done queuing things up, now waiting for results queue to drain 30529 1726882623.43465: waiting for pending results... 
30529 1726882623.43645: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882623.43758: in run() - task 12673a56-9f93-b0f1-edc0-000000000d73 30529 1726882623.43770: variable 'ansible_search_path' from source: unknown 30529 1726882623.43774: variable 'ansible_search_path' from source: unknown 30529 1726882623.43815: calling self._execute() 30529 1726882623.44114: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882623.44118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882623.44121: variable 'omit' from source: magic vars 30529 1726882623.44852: variable 'ansible_distribution_major_version' from source: facts 30529 1726882623.44861: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882623.44868: variable 'omit' from source: magic vars 30529 1726882623.44944: variable 'omit' from source: magic vars 30529 1726882623.44979: variable 'omit' from source: magic vars 30529 1726882623.45218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882623.45252: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882623.45271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882623.45294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882623.45308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882623.45398: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882623.45401: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882623.45404: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882623.45647: Set connection var ansible_shell_executable to /bin/sh 30529 1726882623.45651: Set connection var ansible_pipelining to False 30529 1726882623.45653: Set connection var ansible_shell_type to sh 30529 1726882623.45698: Set connection var ansible_timeout to 10 30529 1726882623.45701: Set connection var ansible_connection to ssh 30529 1726882623.45704: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882623.45706: variable 'ansible_shell_executable' from source: unknown 30529 1726882623.45708: variable 'ansible_connection' from source: unknown 30529 1726882623.45711: variable 'ansible_module_compression' from source: unknown 30529 1726882623.45713: variable 'ansible_shell_type' from source: unknown 30529 1726882623.45715: variable 'ansible_shell_executable' from source: unknown 30529 1726882623.45722: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882623.45724: variable 'ansible_pipelining' from source: unknown 30529 1726882623.45726: variable 'ansible_timeout' from source: unknown 30529 1726882623.45734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882623.46158: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882623.46168: variable 'omit' from source: magic vars 30529 1726882623.46170: starting attempt loop 30529 1726882623.46173: running the handler 30529 1726882623.46175: _low_level_execute_command(): starting 30529 1726882623.46177: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882623.47349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882623.47353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.47359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882623.47362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.47389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882623.47399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882623.47420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882623.47484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.49145: stdout chunk (state=3): >>>/root <<< 30529 1726882623.49279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882623.49290: stdout chunk (state=3): >>><<< 30529 1726882623.49311: stderr chunk (state=3): >>><<< 30529 1726882623.49417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882623.49421: _low_level_execute_command(): starting 30529 1726882623.49424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620 `" && echo ansible-tmp-1726882623.4933057-32290-274850674508620="` echo /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620 `" ) && sleep 0' 30529 1726882623.50050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882623.50053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.50120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882623.50147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882623.50172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882623.50267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.52132: stdout chunk (state=3): >>>ansible-tmp-1726882623.4933057-32290-274850674508620=/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620 <<< 30529 1726882623.52288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882623.52291: stdout chunk (state=3): >>><<< 30529 1726882623.52296: stderr chunk (state=3): >>><<< 30529 1726882623.52500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882623.4933057-32290-274850674508620=/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882623.52504: variable 'ansible_module_compression' from source: unknown 30529 1726882623.52507: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882623.52510: variable 'ansible_facts' from source: unknown 30529 1726882623.53107: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py 30529 1726882623.53349: Sending initial data 30529 1726882623.53403: Sent initial data (162 bytes) 30529 1726882623.54024: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882623.54206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882623.54519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882623.54563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.56098: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882623.56103: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882623.56147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882623.56198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp03myl243 /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py <<< 30529 1726882623.56202: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py" <<< 30529 1726882623.56234: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp03myl243" to remote "/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py" <<< 30529 1726882623.57883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882623.57923: stderr chunk (state=3): >>><<< 30529 1726882623.57928: stdout chunk (state=3): >>><<< 30529 1726882623.57963: done transferring module to remote 30529 1726882623.57971: _low_level_execute_command(): starting 30529 1726882623.57976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/ /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py && sleep 0' 30529 1726882623.58558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.58563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882623.58565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882623.58606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882623.60476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882623.60479: stderr chunk (state=3): >>><<< 30529 1726882623.60481: stdout chunk (state=3): >>><<< 30529 1726882623.60487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882623.60489: _low_level_execute_command(): starting 30529 1726882623.60491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/AnsiballZ_package_facts.py && sleep 0' 30529 1726882623.60940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882623.60945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882623.60977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.60980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882623.60982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882623.60985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882623.61042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882623.61045: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882623.61098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882624.04808: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": 
"1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30529 1726882624.04883: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": 
"memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": 
"8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": 
"python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30529 1726882624.05050: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": 
[{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": 
[{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": 
"xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": 
"keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882624.06930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882624.06941: stderr chunk (state=3): >>><<< 30529 1726882624.06962: stdout chunk (state=3): >>><<< 30529 1726882624.07016: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882624.09306: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882624.09332: _low_level_execute_command(): starting 30529 1726882624.09341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882623.4933057-32290-274850674508620/ > /dev/null 2>&1 && sleep 0' 30529 1726882624.10031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882624.10045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882624.10066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882624.10085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882624.10174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882624.10206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882624.10224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882624.10247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882624.10337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882624.12217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882624.12221: stdout chunk (state=3): >>><<< 30529 1726882624.12227: stderr chunk (state=3): >>><<< 30529 1726882624.12248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882624.12254: handler run complete 30529 1726882624.13110: variable 'ansible_facts' from source: unknown 30529 1726882624.13714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.15512: variable 'ansible_facts' from source: unknown 30529 1726882624.15944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.16646: attempt loop complete, returning result 30529 1726882624.16657: _execute() done 30529 1726882624.16660: dumping result to json 30529 1726882624.16866: done dumping result, returning 30529 1726882624.16878: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000000d73] 30529 1726882624.16881: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d73 30529 1726882624.19222: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d73 30529 1726882624.19225: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882624.19382: no more pending results, returning what we have 30529 1726882624.19385: results queue empty 30529 1726882624.19386: checking for any_errors_fatal 30529 1726882624.19396: done checking for any_errors_fatal 30529 1726882624.19397: checking for max_fail_percentage 30529 1726882624.19399: done checking for max_fail_percentage 30529 1726882624.19400: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.19401: done checking to see if all hosts have failed 30529 1726882624.19401: getting the remaining hosts for this loop 30529 1726882624.19403: done getting the remaining hosts for this loop 30529 1726882624.19407: getting 
the next task for host managed_node1 30529 1726882624.19416: done getting next task for host managed_node1 30529 1726882624.19420: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882624.19425: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882624.19437: getting variables 30529 1726882624.19438: in VariableManager get_vars() 30529 1726882624.19468: Calling all_inventory to load vars for managed_node1 30529 1726882624.19471: Calling groups_inventory to load vars for managed_node1 30529 1726882624.19473: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.19483: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.19486: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.19488: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.20819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.22769: done with get_vars() 30529 1726882624.22819: done getting variables 30529 1726882624.22920: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:04 -0400 (0:00:00.797) 0:00:38.255 ****** 30529 1726882624.22973: entering _queue_task() for managed_node1/debug 30529 1726882624.23529: worker is 1 (out of 1 available) 30529 1726882624.23540: exiting _queue_task() for managed_node1/debug 30529 1726882624.23553: done queuing things up, now waiting for results queue to drain 30529 1726882624.23554: waiting for pending results... 
30529 1726882624.23799: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882624.23887: in run() - task 12673a56-9f93-b0f1-edc0-000000000d17 30529 1726882624.23916: variable 'ansible_search_path' from source: unknown 30529 1726882624.23924: variable 'ansible_search_path' from source: unknown 30529 1726882624.23965: calling self._execute() 30529 1726882624.24078: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.24096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.24199: variable 'omit' from source: magic vars 30529 1726882624.24517: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.24541: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.24557: variable 'omit' from source: magic vars 30529 1726882624.24630: variable 'omit' from source: magic vars 30529 1726882624.24743: variable 'network_provider' from source: set_fact 30529 1726882624.24798: variable 'omit' from source: magic vars 30529 1726882624.24818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882624.24857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882624.24888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882624.24986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882624.24992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882624.24999: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882624.25001: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882624.25004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.25084: Set connection var ansible_shell_executable to /bin/sh 30529 1726882624.25106: Set connection var ansible_pipelining to False 30529 1726882624.25112: Set connection var ansible_shell_type to sh 30529 1726882624.25125: Set connection var ansible_timeout to 10 30529 1726882624.25130: Set connection var ansible_connection to ssh 30529 1726882624.25138: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882624.25163: variable 'ansible_shell_executable' from source: unknown 30529 1726882624.25171: variable 'ansible_connection' from source: unknown 30529 1726882624.25177: variable 'ansible_module_compression' from source: unknown 30529 1726882624.25184: variable 'ansible_shell_type' from source: unknown 30529 1726882624.25206: variable 'ansible_shell_executable' from source: unknown 30529 1726882624.25209: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.25211: variable 'ansible_pipelining' from source: unknown 30529 1726882624.25218: variable 'ansible_timeout' from source: unknown 30529 1726882624.25315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.25373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882624.25388: variable 'omit' from source: magic vars 30529 1726882624.25403: starting attempt loop 30529 1726882624.25410: running the handler 30529 1726882624.25467: handler run complete 30529 1726882624.25547: attempt loop complete, returning result 30529 1726882624.25554: _execute() done 30529 1726882624.25801: dumping result to json 30529 1726882624.25804: done dumping result, returning 
30529 1726882624.25807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000000d17] 30529 1726882624.25809: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d17 30529 1726882624.25874: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d17 30529 1726882624.25877: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882624.25972: no more pending results, returning what we have 30529 1726882624.25976: results queue empty 30529 1726882624.25977: checking for any_errors_fatal 30529 1726882624.25991: done checking for any_errors_fatal 30529 1726882624.25994: checking for max_fail_percentage 30529 1726882624.25997: done checking for max_fail_percentage 30529 1726882624.25999: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.26000: done checking to see if all hosts have failed 30529 1726882624.26000: getting the remaining hosts for this loop 30529 1726882624.26005: done getting the remaining hosts for this loop 30529 1726882624.26009: getting the next task for host managed_node1 30529 1726882624.26019: done getting next task for host managed_node1 30529 1726882624.26024: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882624.26030: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.26043: getting variables 30529 1726882624.26045: in VariableManager get_vars() 30529 1726882624.26082: Calling all_inventory to load vars for managed_node1 30529 1726882624.26086: Calling groups_inventory to load vars for managed_node1 30529 1726882624.26088: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.26503: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.26506: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.26509: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.28296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.30719: done with get_vars() 30529 1726882624.30741: done getting variables 30529 1726882624.31011: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:04 -0400 (0:00:00.080) 0:00:38.336 ****** 30529 1726882624.31052: entering _queue_task() for managed_node1/fail 30529 1726882624.31830: worker is 1 (out of 1 available) 30529 1726882624.31842: exiting _queue_task() for managed_node1/fail 30529 1726882624.31855: done queuing things up, now waiting for results queue to drain 30529 1726882624.31857: waiting for pending results... 30529 1726882624.32466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882624.32811: in run() - task 12673a56-9f93-b0f1-edc0-000000000d18 30529 1726882624.32834: variable 'ansible_search_path' from source: unknown 30529 1726882624.32843: variable 'ansible_search_path' from source: unknown 30529 1726882624.32889: calling self._execute() 30529 1726882624.33094: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.33099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.33102: variable 'omit' from source: magic vars 30529 1726882624.33407: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.33431: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.33558: variable 'network_state' from source: role '' defaults 30529 1726882624.33574: Evaluated conditional (network_state != {}): False 30529 1726882624.33582: when evaluation is False, skipping this task 30529 1726882624.33590: _execute() done 30529 1726882624.33600: dumping result to json 30529 1726882624.33609: done dumping result, returning 30529 1726882624.33621: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000000d18] 30529 1726882624.33637: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d18 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882624.33789: no more pending results, returning what we have 30529 1726882624.33795: results queue empty 30529 1726882624.33797: checking for any_errors_fatal 30529 1726882624.33804: done checking for any_errors_fatal 30529 1726882624.33805: checking for max_fail_percentage 30529 1726882624.33807: done checking for max_fail_percentage 30529 1726882624.33808: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.33809: done checking to see if all hosts have failed 30529 1726882624.33810: getting the remaining hosts for this loop 30529 1726882624.33812: done getting the remaining hosts for this loop 30529 1726882624.33816: getting the next task for host managed_node1 30529 1726882624.33826: done getting next task for host managed_node1 30529 1726882624.33831: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882624.33837: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.33862: getting variables 30529 1726882624.33864: in VariableManager get_vars() 30529 1726882624.34199: Calling all_inventory to load vars for managed_node1 30529 1726882624.34202: Calling groups_inventory to load vars for managed_node1 30529 1726882624.34205: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.34210: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d18 30529 1726882624.34214: WORKER PROCESS EXITING 30529 1726882624.34222: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.34226: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.34229: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.36876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.47119: done with get_vars() 30529 1726882624.47145: done getting variables 30529 1726882624.47188: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:04 -0400 (0:00:00.161) 0:00:38.498 ****** 30529 1726882624.47223: entering _queue_task() for managed_node1/fail 30529 1726882624.47575: worker is 1 (out of 1 available) 30529 1726882624.47587: exiting _queue_task() for managed_node1/fail 30529 1726882624.47801: done queuing things up, now waiting for results queue to drain 30529 1726882624.47804: waiting for pending results... 30529 1726882624.47885: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882624.48140: in run() - task 12673a56-9f93-b0f1-edc0-000000000d19 30529 1726882624.48149: variable 'ansible_search_path' from source: unknown 30529 1726882624.48152: variable 'ansible_search_path' from source: unknown 30529 1726882624.48157: calling self._execute() 30529 1726882624.48242: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.48257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.48271: variable 'omit' from source: magic vars 30529 1726882624.48667: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.48694: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.48823: variable 'network_state' from source: role '' defaults 30529 1726882624.48844: Evaluated conditional (network_state != {}): False 30529 1726882624.48901: when evaluation is False, skipping this task 30529 1726882624.48905: _execute() done 30529 1726882624.48907: dumping result to json 30529 1726882624.48910: done dumping result, returning 30529 1726882624.48914: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000000d19] 30529 1726882624.48917: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d19 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882624.49244: no more pending results, returning what we have 30529 1726882624.49247: results queue empty 30529 1726882624.49249: checking for any_errors_fatal 30529 1726882624.49255: done checking for any_errors_fatal 30529 1726882624.49256: checking for max_fail_percentage 30529 1726882624.49258: done checking for max_fail_percentage 30529 1726882624.49259: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.49260: done checking to see if all hosts have failed 30529 1726882624.49261: getting the remaining hosts for this loop 30529 1726882624.49263: done getting the remaining hosts for this loop 30529 1726882624.49267: getting the next task for host managed_node1 30529 1726882624.49275: done getting next task for host managed_node1 30529 1726882624.49279: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882624.49285: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.49306: getting variables 30529 1726882624.49308: in VariableManager get_vars() 30529 1726882624.49344: Calling all_inventory to load vars for managed_node1 30529 1726882624.49347: Calling groups_inventory to load vars for managed_node1 30529 1726882624.49349: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.49361: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.49364: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.49367: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.49906: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d19 30529 1726882624.49909: WORKER PROCESS EXITING 30529 1726882624.50811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.52330: done with get_vars() 30529 1726882624.52354: done getting variables 30529 1726882624.52455: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:04 -0400 (0:00:00.052) 0:00:38.551 ****** 30529 1726882624.52520: entering _queue_task() for managed_node1/fail 30529 1726882624.53172: worker is 1 (out of 1 available) 30529 1726882624.53188: exiting _queue_task() for managed_node1/fail 30529 1726882624.53203: done queuing things up, now waiting for results queue to drain 30529 1726882624.53205: waiting for pending results... 30529 1726882624.53375: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882624.53557: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1a 30529 1726882624.53590: variable 'ansible_search_path' from source: unknown 30529 1726882624.53611: variable 'ansible_search_path' from source: unknown 30529 1726882624.53677: calling self._execute() 30529 1726882624.53828: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.53867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.53898: variable 'omit' from source: magic vars 30529 1726882624.54448: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.54471: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.54785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882624.57813: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882624.58226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882624.58269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882624.58399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882624.58403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882624.58425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.58460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.58491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.58542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.58562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.58660: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.58683: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882624.58825: variable 'ansible_distribution' from source: facts 30529 1726882624.58835: variable '__network_rh_distros' from source: role '' defaults 30529 1726882624.59000: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882624.59099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.59136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.59166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.59212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.59238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.59291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.59323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.59353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.59392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.59410: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.59455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.59479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.59507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.59548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.59564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.59851: variable 'network_connections' from source: include params 30529 1726882624.59905: variable 'interface' from source: play vars 30529 1726882624.60099: variable 'interface' from source: play vars 30529 1726882624.60101: variable 'network_state' from source: role '' defaults 30529 1726882624.60205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882624.60486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882624.60579: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 
1726882624.60661: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882624.60756: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882624.60863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882624.60884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882624.60971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.61028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882624.61072: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882624.61075: when evaluation is False, skipping this task 30529 1726882624.61078: _execute() done 30529 1726882624.61183: dumping result to json 30529 1726882624.61187: done dumping result, returning 30529 1726882624.61190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000000d1a] 30529 1726882624.61192: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1a 30529 1726882624.61279: done sending task 
result for task 12673a56-9f93-b0f1-edc0-000000000d1a 30529 1726882624.61283: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882624.61362: no more pending results, returning what we have 30529 1726882624.61366: results queue empty 30529 1726882624.61367: checking for any_errors_fatal 30529 1726882624.61373: done checking for any_errors_fatal 30529 1726882624.61374: checking for max_fail_percentage 30529 1726882624.61376: done checking for max_fail_percentage 30529 1726882624.61377: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.61378: done checking to see if all hosts have failed 30529 1726882624.61379: getting the remaining hosts for this loop 30529 1726882624.61380: done getting the remaining hosts for this loop 30529 1726882624.61385: getting the next task for host managed_node1 30529 1726882624.61396: done getting next task for host managed_node1 30529 1726882624.61401: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882624.61409: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.61430: getting variables 30529 1726882624.61432: in VariableManager get_vars() 30529 1726882624.61478: Calling all_inventory to load vars for managed_node1 30529 1726882624.61481: Calling groups_inventory to load vars for managed_node1 30529 1726882624.61485: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.61729: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.61740: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.61745: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.63662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.65657: done with get_vars() 30529 1726882624.65747: done getting variables 30529 1726882624.65869: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:04 -0400 (0:00:00.133) 0:00:38.685 ****** 30529 1726882624.65906: entering _queue_task() for managed_node1/dnf 30529 1726882624.66238: worker is 1 (out of 1 available) 30529 1726882624.66252: exiting _queue_task() for managed_node1/dnf 30529 1726882624.66292: done queuing things up, now waiting for results queue to drain 30529 1726882624.66296: waiting for pending results... 30529 1726882624.66955: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882624.66960: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1b 30529 1726882624.66964: variable 'ansible_search_path' from source: unknown 30529 1726882624.66967: variable 'ansible_search_path' from source: unknown 30529 1726882624.67104: calling self._execute() 30529 1726882624.67265: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.67414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.67417: variable 'omit' from source: magic vars 30529 1726882624.67991: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.68097: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.68211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882624.71139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882624.71226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882624.71296: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882624.71333: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882624.71369: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882624.71526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.71640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.71741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.71792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.71860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.72211: variable 'ansible_distribution' from source: facts 30529 1726882624.72215: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.72218: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882624.72221: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882624.72669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.72700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.72729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.72777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.72799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.72843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.72938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.73016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.73111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.73132: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.73175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.73234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.73267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.73357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.73412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.73687: variable 'network_connections' from source: include params 30529 1726882624.73690: variable 'interface' from source: play vars 30529 1726882624.73722: variable 'interface' from source: play vars 30529 1726882624.73824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882624.74000: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882624.74117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882624.74120: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882624.74123: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882624.74156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882624.74183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882624.74373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.74504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882624.74507: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882624.75146: variable 'network_connections' from source: include params 30529 1726882624.75180: variable 'interface' from source: play vars 30529 1726882624.75284: variable 'interface' from source: play vars 30529 1726882624.75318: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882624.75327: when evaluation is False, skipping this task 30529 1726882624.75335: _execute() done 30529 1726882624.75341: dumping result to json 30529 1726882624.75348: done dumping result, returning 30529 1726882624.75360: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000d1b] 30529 
1726882624.75369: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1b skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882624.75541: no more pending results, returning what we have 30529 1726882624.75546: results queue empty 30529 1726882624.75547: checking for any_errors_fatal 30529 1726882624.75556: done checking for any_errors_fatal 30529 1726882624.75557: checking for max_fail_percentage 30529 1726882624.75559: done checking for max_fail_percentage 30529 1726882624.75560: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.75561: done checking to see if all hosts have failed 30529 1726882624.75562: getting the remaining hosts for this loop 30529 1726882624.75564: done getting the remaining hosts for this loop 30529 1726882624.75568: getting the next task for host managed_node1 30529 1726882624.75578: done getting next task for host managed_node1 30529 1726882624.75582: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882624.75588: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.75615: getting variables 30529 1726882624.75618: in VariableManager get_vars() 30529 1726882624.75657: Calling all_inventory to load vars for managed_node1 30529 1726882624.75660: Calling groups_inventory to load vars for managed_node1 30529 1726882624.75662: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.75674: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.75678: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.75681: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.76513: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1b 30529 1726882624.76516: WORKER PROCESS EXITING 30529 1726882624.78222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.81409: done with get_vars() 30529 1726882624.81443: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882624.81668: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:04 -0400 (0:00:00.157) 0:00:38.843 ****** 30529 1726882624.81881: entering _queue_task() for managed_node1/yum 30529 1726882624.83256: worker is 1 (out of 1 available) 30529 1726882624.83269: exiting _queue_task() for managed_node1/yum 30529 1726882624.83287: done queuing things up, now waiting for results queue to drain 30529 1726882624.83289: waiting for pending results... 30529 1726882624.83463: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882624.83645: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1c 30529 1726882624.83665: variable 'ansible_search_path' from source: unknown 30529 1726882624.83670: variable 'ansible_search_path' from source: unknown 30529 1726882624.83711: calling self._execute() 30529 1726882624.83817: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.83825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.83852: variable 'omit' from source: magic vars 30529 1726882624.84312: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.84316: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.84526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882624.87470: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882624.87600: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882624.87604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882624.87621: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882624.87647: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882624.87727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.87758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.87783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.87831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.87844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.87941: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.88046: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882624.88049: when evaluation is False, skipping this task 30529 1726882624.88051: _execute() done 30529 1726882624.88053: dumping result to json 30529 1726882624.88055: done dumping result, 
returning 30529 1726882624.88057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000d1c] 30529 1726882624.88059: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1c 30529 1726882624.88129: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1c 30529 1726882624.88131: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882624.88178: no more pending results, returning what we have 30529 1726882624.88181: results queue empty 30529 1726882624.88182: checking for any_errors_fatal 30529 1726882624.88188: done checking for any_errors_fatal 30529 1726882624.88191: checking for max_fail_percentage 30529 1726882624.88195: done checking for max_fail_percentage 30529 1726882624.88196: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.88197: done checking to see if all hosts have failed 30529 1726882624.88198: getting the remaining hosts for this loop 30529 1726882624.88200: done getting the remaining hosts for this loop 30529 1726882624.88203: getting the next task for host managed_node1 30529 1726882624.88212: done getting next task for host managed_node1 30529 1726882624.88216: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882624.88221: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882624.88240: getting variables 30529 1726882624.88242: in VariableManager get_vars() 30529 1726882624.88276: Calling all_inventory to load vars for managed_node1 30529 1726882624.88278: Calling groups_inventory to load vars for managed_node1 30529 1726882624.88280: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.88291: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.88401: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.88405: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.90013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882624.91604: done with get_vars() 30529 1726882624.91629: done getting variables 30529 1726882624.91688: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:04 -0400 (0:00:00.100) 0:00:38.943 ****** 30529 1726882624.91732: entering _queue_task() for managed_node1/fail 30529 1726882624.92112: worker is 1 (out of 1 available) 30529 1726882624.92126: exiting _queue_task() for managed_node1/fail 30529 1726882624.92143: done queuing things up, now waiting for results queue to drain 30529 1726882624.92145: waiting for pending results... 30529 1726882624.92511: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882624.92899: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1d 30529 1726882624.92903: variable 'ansible_search_path' from source: unknown 30529 1726882624.92906: variable 'ansible_search_path' from source: unknown 30529 1726882624.92909: calling self._execute() 30529 1726882624.92934: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882624.92947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882624.92958: variable 'omit' from source: magic vars 30529 1726882624.93319: variable 'ansible_distribution_major_version' from source: facts 30529 1726882624.93330: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882624.93443: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882624.93626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882624.95762: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882624.95837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882624.95870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882624.95905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882624.95929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882624.96005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.96034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.96056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.96096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.96120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.96198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 
1726882624.96201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.96204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.96227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.96245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882624.96284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882624.96314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882624.96340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.96377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882624.96498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882624.96562: variable 'network_connections' from source: include params 30529 1726882624.96577: variable 'interface' from source: play vars 30529 1726882624.96647: variable 'interface' from source: play vars 30529 1726882624.96734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882624.96898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882624.96950: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882624.96986: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882624.97024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882624.97071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882624.97102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882624.97132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882624.97163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882624.97220: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882624.97456: variable 'network_connections' from source: include params 30529 1726882624.97467: variable 'interface' from source: play 
vars 30529 1726882624.97533: variable 'interface' from source: play vars 30529 1726882624.97562: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882624.97570: when evaluation is False, skipping this task 30529 1726882624.97599: _execute() done 30529 1726882624.97602: dumping result to json 30529 1726882624.97604: done dumping result, returning 30529 1726882624.97606: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000d1d] 30529 1726882624.97614: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1d 30529 1726882624.97767: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1d 30529 1726882624.97771: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882624.97823: no more pending results, returning what we have 30529 1726882624.97826: results queue empty 30529 1726882624.97827: checking for any_errors_fatal 30529 1726882624.97832: done checking for any_errors_fatal 30529 1726882624.97833: checking for max_fail_percentage 30529 1726882624.97835: done checking for max_fail_percentage 30529 1726882624.97835: checking to see if all hosts have failed and the running result is not ok 30529 1726882624.97836: done checking to see if all hosts have failed 30529 1726882624.97837: getting the remaining hosts for this loop 30529 1726882624.97839: done getting the remaining hosts for this loop 30529 1726882624.97842: getting the next task for host managed_node1 30529 1726882624.97853: done getting next task for host managed_node1 30529 1726882624.97856: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882624.97861: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882624.97880: getting variables 30529 1726882624.97882: in VariableManager get_vars() 30529 1726882624.97918: Calling all_inventory to load vars for managed_node1 30529 1726882624.97920: Calling groups_inventory to load vars for managed_node1 30529 1726882624.97923: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882624.97932: Calling all_plugins_play to load vars for managed_node1 30529 1726882624.97935: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882624.97937: Calling groups_plugins_play to load vars for managed_node1 30529 1726882624.99463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882625.02551: done with get_vars() 30529 1726882625.02573: done getting variables 30529 1726882625.02658: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:05 -0400 (0:00:00.109) 0:00:39.053 ****** 30529 1726882625.02733: entering _queue_task() for managed_node1/package 30529 1726882625.03137: worker is 1 (out of 1 available) 30529 1726882625.03153: exiting _queue_task() for managed_node1/package 30529 1726882625.03165: done queuing things up, now waiting for results queue to drain 30529 1726882625.03167: waiting for pending results... 
30529 1726882625.03749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882625.03814: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1e 30529 1726882625.03842: variable 'ansible_search_path' from source: unknown 30529 1726882625.04064: variable 'ansible_search_path' from source: unknown 30529 1726882625.04069: calling self._execute() 30529 1726882625.04208: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.04218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.04230: variable 'omit' from source: magic vars 30529 1726882625.05063: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.05156: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882625.05458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882625.05897: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882625.05975: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882625.06037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882625.06121: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882625.06250: variable 'network_packages' from source: role '' defaults 30529 1726882625.06361: variable '__network_provider_setup' from source: role '' defaults 30529 1726882625.06376: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882625.06441: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882625.06458: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882625.06532: variable 
'__network_packages_default_nm' from source: role '' defaults 30529 1726882625.06702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882625.09979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882625.10027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882625.10125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882625.10162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882625.10398: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882625.10438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.10499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.10738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.10742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.10745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882625.10820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.10871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.10930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.11174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.11177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.11574: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882625.11919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.11971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.11998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.12035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.12058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.12182: variable 'ansible_python' from source: facts 30529 1726882625.12207: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882625.12318: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882625.12415: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882625.12549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.12578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.12616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.12657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.12677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.12728: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.12760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.12783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.12830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.12849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.13010: variable 'network_connections' from source: include params 30529 1726882625.13023: variable 'interface' from source: play vars 30529 1726882625.13147: variable 'interface' from source: play vars 30529 1726882625.13227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882625.13267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882625.13306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.13342: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882625.13408: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882625.13717: variable 'network_connections' from source: include params 30529 1726882625.13728: variable 'interface' from source: play vars 30529 1726882625.13861: variable 'interface' from source: play vars 30529 1726882625.14002: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882625.14005: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882625.14535: variable 'network_connections' from source: include params 30529 1726882625.14547: variable 'interface' from source: play vars 30529 1726882625.14621: variable 'interface' from source: play vars 30529 1726882625.14649: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882625.14736: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882625.15099: variable 'network_connections' from source: include params 30529 1726882625.15103: variable 'interface' from source: play vars 30529 1726882625.15124: variable 'interface' from source: play vars 30529 1726882625.15187: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882625.15258: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882625.15271: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882625.15336: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882625.15572: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882625.16135: variable 'network_connections' from source: include params 30529 1726882625.16199: variable 'interface' from 
source: play vars 30529 1726882625.16203: variable 'interface' from source: play vars 30529 1726882625.16213: variable 'ansible_distribution' from source: facts 30529 1726882625.16226: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.16235: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.16252: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882625.16420: variable 'ansible_distribution' from source: facts 30529 1726882625.16428: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.16441: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.16458: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882625.16617: variable 'ansible_distribution' from source: facts 30529 1726882625.16625: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.16632: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.16674: variable 'network_provider' from source: set_fact 30529 1726882625.16769: variable 'ansible_facts' from source: unknown 30529 1726882625.17464: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882625.17474: when evaluation is False, skipping this task 30529 1726882625.17481: _execute() done 30529 1726882625.17487: dumping result to json 30529 1726882625.17497: done dumping result, returning 30529 1726882625.17513: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000000d1e] 30529 1726882625.17531: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1e skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 
1726882625.17772: no more pending results, returning what we have 30529 1726882625.17776: results queue empty 30529 1726882625.17778: checking for any_errors_fatal 30529 1726882625.17785: done checking for any_errors_fatal 30529 1726882625.17786: checking for max_fail_percentage 30529 1726882625.17788: done checking for max_fail_percentage 30529 1726882625.17789: checking to see if all hosts have failed and the running result is not ok 30529 1726882625.17790: done checking to see if all hosts have failed 30529 1726882625.17791: getting the remaining hosts for this loop 30529 1726882625.17794: done getting the remaining hosts for this loop 30529 1726882625.17799: getting the next task for host managed_node1 30529 1726882625.17809: done getting next task for host managed_node1 30529 1726882625.17813: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882625.17824: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882625.17849: getting variables 30529 1726882625.17851: in VariableManager get_vars() 30529 1726882625.18241: Calling all_inventory to load vars for managed_node1 30529 1726882625.18244: Calling groups_inventory to load vars for managed_node1 30529 1726882625.18252: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882625.18264: Calling all_plugins_play to load vars for managed_node1 30529 1726882625.18267: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882625.18270: Calling groups_plugins_play to load vars for managed_node1 30529 1726882625.19133: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1e 30529 1726882625.19136: WORKER PROCESS EXITING 30529 1726882625.20137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882625.21915: done with get_vars() 30529 1726882625.21945: done getting variables 30529 1726882625.22035: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:05 -0400 (0:00:00.193) 0:00:39.246 ****** 30529 1726882625.22077: entering _queue_task() for managed_node1/package 30529 1726882625.22603: worker is 1 (out of 1 available) 30529 1726882625.22618: exiting _queue_task() for managed_node1/package 30529 1726882625.22632: done queuing things up, now waiting for results queue to drain 30529 
1726882625.22633: waiting for pending results... 30529 1726882625.23105: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882625.23365: in run() - task 12673a56-9f93-b0f1-edc0-000000000d1f 30529 1726882625.23387: variable 'ansible_search_path' from source: unknown 30529 1726882625.23397: variable 'ansible_search_path' from source: unknown 30529 1726882625.23441: calling self._execute() 30529 1726882625.23546: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.23561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.23580: variable 'omit' from source: magic vars 30529 1726882625.23946: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.23961: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882625.24082: variable 'network_state' from source: role '' defaults 30529 1726882625.24332: Evaluated conditional (network_state != {}): False 30529 1726882625.24336: when evaluation is False, skipping this task 30529 1726882625.24338: _execute() done 30529 1726882625.24341: dumping result to json 30529 1726882625.24343: done dumping result, returning 30529 1726882625.24345: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000d1f] 30529 1726882625.24348: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1f 30529 1726882625.24526: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d1f 30529 1726882625.24530: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882625.24577: no more pending results, returning what we have 30529 1726882625.24581: 
results queue empty 30529 1726882625.24582: checking for any_errors_fatal 30529 1726882625.24591: done checking for any_errors_fatal 30529 1726882625.24592: checking for max_fail_percentage 30529 1726882625.24596: done checking for max_fail_percentage 30529 1726882625.24597: checking to see if all hosts have failed and the running result is not ok 30529 1726882625.24598: done checking to see if all hosts have failed 30529 1726882625.24599: getting the remaining hosts for this loop 30529 1726882625.24600: done getting the remaining hosts for this loop 30529 1726882625.24604: getting the next task for host managed_node1 30529 1726882625.24614: done getting next task for host managed_node1 30529 1726882625.24617: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882625.24623: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882625.24646: getting variables 30529 1726882625.24649: in VariableManager get_vars() 30529 1726882625.24682: Calling all_inventory to load vars for managed_node1 30529 1726882625.24685: Calling groups_inventory to load vars for managed_node1 30529 1726882625.24687: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882625.24812: Calling all_plugins_play to load vars for managed_node1 30529 1726882625.24817: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882625.24820: Calling groups_plugins_play to load vars for managed_node1 30529 1726882625.26618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882625.28463: done with get_vars() 30529 1726882625.28491: done getting variables 30529 1726882625.28615: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:05 -0400 (0:00:00.065) 0:00:39.312 ****** 30529 1726882625.28656: entering _queue_task() for managed_node1/package 30529 1726882625.29106: worker is 1 (out of 1 available) 30529 1726882625.29119: exiting _queue_task() for managed_node1/package 30529 1726882625.29133: done queuing things up, now waiting for results queue to drain 30529 1726882625.29135: waiting for pending results... 
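The `skipping: [managed_node1]` results in this section all follow the same pattern: a task guarded by a `when:` conditional whose failing clause is echoed back as `false_condition`. For the task at roles/network/tasks/main.yml:85 the log attests the task name, the `package` action plugin, and two evaluated conditionals (`ansible_distribution_major_version != '6'`, True; `network_state != {}`, False). A hypothetical sketch of such a task follows; the package list is an assumption for illustration, not taken from the log:

```yaml
# Hypothetical reconstruction -- only the task name and the two when:
# clauses are attested in the log output; the package names are illustrative.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}    # evaluated False here, so the task is skipped
```

A `when:` list is AND-ed, so a single False clause produces the `"skip_reason": "Conditional result was False"` entry exactly as recorded in the log.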
30529 1726882625.29425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882625.29567: in run() - task 12673a56-9f93-b0f1-edc0-000000000d20 30529 1726882625.29588: variable 'ansible_search_path' from source: unknown 30529 1726882625.29600: variable 'ansible_search_path' from source: unknown 30529 1726882625.29649: calling self._execute() 30529 1726882625.29754: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.29766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.29782: variable 'omit' from source: magic vars 30529 1726882625.30190: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.30210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882625.30341: variable 'network_state' from source: role '' defaults 30529 1726882625.30398: Evaluated conditional (network_state != {}): False 30529 1726882625.30401: when evaluation is False, skipping this task 30529 1726882625.30404: _execute() done 30529 1726882625.30407: dumping result to json 30529 1726882625.30409: done dumping result, returning 30529 1726882625.30412: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000000d20] 30529 1726882625.30414: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d20 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882625.30656: no more pending results, returning what we have 30529 1726882625.30660: results queue empty 30529 1726882625.30661: checking for any_errors_fatal 30529 1726882625.30667: done checking for any_errors_fatal 30529 1726882625.30668: checking for max_fail_percentage 30529 
1726882625.30670: done checking for max_fail_percentage 30529 1726882625.30671: checking to see if all hosts have failed and the running result is not ok 30529 1726882625.30671: done checking to see if all hosts have failed 30529 1726882625.30672: getting the remaining hosts for this loop 30529 1726882625.30674: done getting the remaining hosts for this loop 30529 1726882625.30678: getting the next task for host managed_node1 30529 1726882625.30688: done getting next task for host managed_node1 30529 1726882625.30692: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882625.30700: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882625.30716: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d20 30529 1726882625.30719: WORKER PROCESS EXITING 30529 1726882625.30738: getting variables 30529 1726882625.30740: in VariableManager get_vars() 30529 1726882625.30774: Calling all_inventory to load vars for managed_node1 30529 1726882625.30776: Calling groups_inventory to load vars for managed_node1 30529 1726882625.30779: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882625.30790: Calling all_plugins_play to load vars for managed_node1 30529 1726882625.30923: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882625.30929: Calling groups_plugins_play to load vars for managed_node1 30529 1726882625.33570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882625.35671: done with get_vars() 30529 1726882625.35727: done getting variables 30529 1726882625.35820: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:05 -0400 (0:00:00.072) 0:00:39.384 ****** 30529 1726882625.35865: entering _queue_task() for managed_node1/service 30529 1726882625.36336: worker is 1 (out of 1 available) 30529 1726882625.36350: exiting _queue_task() for managed_node1/service 30529 1726882625.36363: done queuing things up, now waiting for results queue to drain 30529 1726882625.36365: waiting for pending results... 
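The TASK banner above points at roles/network/tasks/main.yml:109, loads the `service` action plugin, and the log subsequently evaluates `__network_wireless_connections_defined or __network_team_connections_defined` to False. A hypothetical sketch of what such a task could look like (only the task name, the file location, the action plugin, and the conditional expression are attested in the log; the service name and state are assumptions):

```yaml
# Hypothetical reconstruction; only the name, the service action, and the
# when: expression appear in the log. Service arguments are illustrative.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With an OR conditional the task runs if either role-defaults flag is true; here both were false, so the run records another conditional skip.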
30529 1726882625.36676: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882625.36900: in run() - task 12673a56-9f93-b0f1-edc0-000000000d21 30529 1726882625.36905: variable 'ansible_search_path' from source: unknown 30529 1726882625.36908: variable 'ansible_search_path' from source: unknown 30529 1726882625.36911: calling self._execute() 30529 1726882625.36975: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.36979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.36990: variable 'omit' from source: magic vars 30529 1726882625.37387: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.37391: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882625.37599: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882625.37738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882625.41289: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882625.41381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882625.41436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882625.41478: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882625.41511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882625.41595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882625.41627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.41687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.41691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.41711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.41755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.41781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.41810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.41900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.41903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.41918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.41946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.41968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.42013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.42027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.42188: variable 'network_connections' from source: include params 30529 1726882625.42204: variable 'interface' from source: play vars 30529 1726882625.42299: variable 'interface' from source: play vars 30529 1726882625.42352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882625.42768: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882625.42808: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882625.42841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882625.42913: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882625.42954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882625.43040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882625.43099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.43102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882625.43146: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882625.43407: variable 'network_connections' from source: include params 30529 1726882625.43421: variable 'interface' from source: play vars 30529 1726882625.43528: variable 'interface' from source: play vars 30529 1726882625.43531: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882625.43535: when evaluation is False, skipping this task 30529 1726882625.43543: _execute() done 30529 1726882625.43545: dumping result to json 30529 1726882625.43548: done dumping result, returning 30529 1726882625.43598: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000000d21] 30529 1726882625.43602: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d21 30529 1726882625.43785: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000000d21 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882625.43912: WORKER PROCESS EXITING 30529 1726882625.43925: no more pending results, returning what we have 30529 1726882625.43934: results queue empty 30529 1726882625.43935: checking for any_errors_fatal 30529 1726882625.43940: done checking for any_errors_fatal 30529 1726882625.43941: checking for max_fail_percentage 30529 1726882625.43943: done checking for max_fail_percentage 30529 1726882625.43944: checking to see if all hosts have failed and the running result is not ok 30529 1726882625.43945: done checking to see if all hosts have failed 30529 1726882625.43945: getting the remaining hosts for this loop 30529 1726882625.43947: done getting the remaining hosts for this loop 30529 1726882625.43951: getting the next task for host managed_node1 30529 1726882625.43963: done getting next task for host managed_node1 30529 1726882625.43967: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882625.43973: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882625.43995: getting variables 30529 1726882625.43997: in VariableManager get_vars() 30529 1726882625.44037: Calling all_inventory to load vars for managed_node1 30529 1726882625.44040: Calling groups_inventory to load vars for managed_node1 30529 1726882625.44043: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882625.44053: Calling all_plugins_play to load vars for managed_node1 30529 1726882625.44056: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882625.44059: Calling groups_plugins_play to load vars for managed_node1 30529 1726882625.46826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882625.48662: done with get_vars() 30529 1726882625.48866: done getting variables 30529 1726882625.49136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:05 -0400 (0:00:00.133) 0:00:39.517 ****** 30529 1726882625.49173: entering _queue_task() for managed_node1/service 30529 1726882625.49983: worker is 1 (out of 1 available) 30529 1726882625.50052: exiting _queue_task() for managed_node1/service 30529 1726882625.50065: done 
queuing things up, now waiting for results queue to drain 30529 1726882625.50067: waiting for pending results... 30529 1726882625.50499: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882625.50542: in run() - task 12673a56-9f93-b0f1-edc0-000000000d22 30529 1726882625.50558: variable 'ansible_search_path' from source: unknown 30529 1726882625.50562: variable 'ansible_search_path' from source: unknown 30529 1726882625.50597: calling self._execute() 30529 1726882625.50926: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.50930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.50941: variable 'omit' from source: magic vars 30529 1726882625.51582: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.51604: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882625.52029: variable 'network_provider' from source: set_fact 30529 1726882625.52033: variable 'network_state' from source: role '' defaults 30529 1726882625.52048: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882625.52059: variable 'omit' from source: magic vars 30529 1726882625.52124: variable 'omit' from source: magic vars 30529 1726882625.52154: variable 'network_service_name' from source: role '' defaults 30529 1726882625.52338: variable 'network_service_name' from source: role '' defaults 30529 1726882625.52442: variable '__network_provider_setup' from source: role '' defaults 30529 1726882625.52446: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882625.53070: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882625.53080: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882625.53262: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882625.53783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882625.58685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882625.58897: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882625.58902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882625.59068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882625.59097: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882625.59181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.59212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.59237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.59280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.59368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.59371: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.59374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.59388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.59430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.59443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.59939: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882625.60242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.60264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.60358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.60361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.60366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.61138: variable 'ansible_python' from source: facts 30529 1726882625.61157: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882625.61362: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882625.61670: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882625.62100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.62104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.62191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.62334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.62413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.62467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882625.62659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882625.62684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.62823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882625.62838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882625.63499: variable 'network_connections' from source: include params 30529 1726882625.63505: variable 'interface' from source: play vars 30529 1726882625.63508: variable 'interface' from source: play vars 30529 1726882625.63699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882625.64101: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882625.64155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882625.64200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882625.64245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882625.64307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882625.64334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882625.64368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882625.64405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882625.64452: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882625.64750: variable 'network_connections' from source: include params 30529 1726882625.64756: variable 'interface' from source: play vars 30529 1726882625.64844: variable 'interface' from source: play vars 30529 1726882625.64869: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882625.64953: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882625.65261: variable 'network_connections' from source: include params 30529 1726882625.65264: variable 'interface' from source: play vars 30529 1726882625.65342: variable 'interface' from source: play vars 30529 1726882625.65364: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882625.65445: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882625.65750: variable 'network_connections' from source: include params 30529 1726882625.65753: variable 'interface' from source: play vars 30529 1726882625.65828: variable 'interface' from source: play vars 30529 1726882625.65884: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882625.65945: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882625.65952: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882625.66020: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882625.66236: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882625.66766: variable 'network_connections' from source: include params 30529 1726882625.66769: variable 'interface' from source: play vars 30529 1726882625.66901: variable 'interface' from source: play vars 30529 1726882625.66905: variable 'ansible_distribution' from source: facts 30529 1726882625.66907: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.66909: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.66919: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882625.67027: variable 'ansible_distribution' from source: facts 30529 1726882625.67030: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.67035: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.67047: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882625.67217: variable 'ansible_distribution' from source: facts 30529 1726882625.67220: variable '__network_rh_distros' from source: role '' defaults 30529 1726882625.67225: variable 'ansible_distribution_major_version' from source: facts 30529 1726882625.67257: variable 'network_provider' from source: set_fact 30529 1726882625.67278: variable 'omit' from source: magic vars 30529 1726882625.67311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882625.67336: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882625.67352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882625.67368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882625.67378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882625.67412: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882625.67416: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.67418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882625.67514: Set connection var ansible_shell_executable to /bin/sh 30529 1726882625.67517: Set connection var ansible_pipelining to False 30529 1726882625.67519: Set connection var ansible_shell_type to sh 30529 1726882625.67529: Set connection var ansible_timeout to 10 30529 1726882625.67531: Set connection var ansible_connection to ssh 30529 1726882625.67536: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882625.67558: variable 'ansible_shell_executable' from source: unknown 30529 1726882625.67560: variable 'ansible_connection' from source: unknown 30529 1726882625.67602: variable 'ansible_module_compression' from source: unknown 30529 1726882625.67606: variable 'ansible_shell_type' from source: unknown 30529 1726882625.67609: variable 'ansible_shell_executable' from source: unknown 30529 1726882625.67612: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882625.67615: variable 'ansible_pipelining' from source: unknown 30529 1726882625.67618: variable 'ansible_timeout' from source: unknown 30529 1726882625.67620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882625.67699: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882625.67707: variable 'omit' from source: magic vars 30529 1726882625.67709: starting attempt loop 30529 1726882625.67711: running the handler 30529 1726882625.67789: variable 'ansible_facts' from source: unknown 30529 1726882625.68572: _low_level_execute_command(): starting 30529 1726882625.68580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882625.69289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882625.69292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882625.69307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882625.69324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882625.69433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882625.69436: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882625.69438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882625.69440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882625.69442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882625.69444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882625.69445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882625.69447: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882625.69449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882625.69451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882625.69452: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882625.69454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882625.69497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882625.69520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882625.69597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882625.71294: stdout chunk (state=3): >>>/root <<< 30529 1726882625.71534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882625.71537: stdout chunk (state=3): >>><<< 30529 1726882625.71539: stderr chunk (state=3): >>><<< 30529 1726882625.71560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882625.71567: _low_level_execute_command(): starting 30529 1726882625.71570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871 `" && echo ansible-tmp-1726882625.7145264-32386-119402288262871="` echo /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871 `" ) && sleep 0' 30529 1726882625.72100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882625.72106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882625.72140: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882625.72186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882625.74041: stdout chunk (state=3): >>>ansible-tmp-1726882625.7145264-32386-119402288262871=/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871 <<< 30529 1726882625.74185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882625.74210: stdout chunk (state=3): >>><<< 30529 1726882625.74213: stderr chunk (state=3): >>><<< 30529 1726882625.74228: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882625.7145264-32386-119402288262871=/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882625.74398: variable 
'ansible_module_compression' from source: unknown 30529 1726882625.74401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882625.74404: variable 'ansible_facts' from source: unknown 30529 1726882625.74572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py 30529 1726882625.74789: Sending initial data 30529 1726882625.74802: Sent initial data (156 bytes) 30529 1726882625.75289: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882625.75307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882625.75323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882625.75414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882625.75437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882625.75451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882625.75467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882625.75534: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882625.77055: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882625.77086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882625.77150: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjq01ulnr /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py <<< 30529 1726882625.77160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py" <<< 30529 1726882625.77222: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjq01ulnr" to remote "/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py" <<< 30529 1726882625.79610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882625.79613: stderr chunk (state=3): >>><<< 30529 1726882625.79615: 
stdout chunk (state=3): >>><<< 30529 1726882625.79623: done transferring module to remote 30529 1726882625.79637: _low_level_execute_command(): starting 30529 1726882625.79660: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/ /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py && sleep 0' 30529 1726882625.80531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882625.80568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882625.80696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882625.80721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882625.80745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882625.80775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882625.80853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882625.82623: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30529 1726882625.82762: stdout chunk (state=3): >>><<< 30529 1726882625.82767: stderr chunk (state=3): >>><<< 30529 1726882625.82772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882625.82775: _low_level_execute_command(): starting 30529 1726882625.82777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/AnsiballZ_systemd.py && sleep 0' 30529 1726882625.83722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882625.83753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882625.83767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882625.83795: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882625.83815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882625.83871: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882625.83932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882625.83947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882625.83991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882625.84087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.12652: stdout chunk (state=3): >>> <<< 30529 1726882626.12712: stdout chunk (state=3): >>>{"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", 
"RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10891264", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313963008", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": 
"3702886400", "CPUUsageNSec": "1766133000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRec<<< 30529 1726882626.12738: stdout chunk (state=3): >>>eive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", 
"LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.ta<<< 30529 1726882626.12757: stdout chunk (state=3): >>>rget system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": 
"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882626.14550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882626.14553: stdout chunk (state=3): >>><<< 30529 1726882626.14556: stderr chunk (state=3): >>><<< 30529 1726882626.14581: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10891264", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313963008", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1766133000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
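The stdout above is the JSON result of the `ansible.legacy.systemd` module run for the "Enable and start NetworkManager" task. A hedged sketch of the role task that would produce this invocation — the argument values mirror the `module_args` and the `'_ansible_no_log': True` flag visible in the log, but the exact contents of the role's task file are an assumption:

```yaml
# Hypothetical reconstruction of the task behind this result; the real
# task lives in the fedora.linux_system_roles.network role. Values are
# taken from the module_args logged above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # matches the "censored" result and _ansible_no_log: True in the log
```

With `no_log: true`, the controller still receives the full module result (as seen in the raw stdout chunk), but the rendered task output is replaced by the "output has been hidden" message.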
30529 1726882626.14794: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882626.14819: _low_level_execute_command(): starting 30529 1726882626.14823: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882625.7145264-32386-119402288262871/ > /dev/null 2>&1 && sleep 0' 30529 1726882626.15475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882626.15484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882626.15570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.15603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882626.15616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.15637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.15700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.17624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882626.17627: stdout chunk (state=3): >>><<< 30529 1726882626.17630: stderr chunk (state=3): >>><<< 30529 1726882626.17667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30529 1726882626.17670: handler run complete 30529 1726882626.17727: attempt loop complete, returning result 30529 1726882626.17731: _execute() done 30529 1726882626.17733: dumping result to json 30529 1726882626.17772: done dumping result, returning 30529 1726882626.17775: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000000d22] 30529 1726882626.17777: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d22 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882626.18968: no more pending results, returning what we have 30529 1726882626.18971: results queue empty 30529 1726882626.18972: checking for any_errors_fatal 30529 1726882626.18976: done checking for any_errors_fatal 30529 1726882626.18977: checking for max_fail_percentage 30529 1726882626.18978: done checking for max_fail_percentage 30529 1726882626.18979: checking to see if all hosts have failed and the running result is not ok 30529 1726882626.18980: done checking to see if all hosts have failed 30529 1726882626.18981: getting the remaining hosts for this loop 30529 1726882626.18982: done getting the remaining hosts for this loop 30529 1726882626.18985: getting the next task for host managed_node1 30529 1726882626.18991: done getting next task for host managed_node1 30529 1726882626.18996: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882626.19001: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882626.19011: getting variables 30529 1726882626.19013: in VariableManager get_vars() 30529 1726882626.19040: Calling all_inventory to load vars for managed_node1 30529 1726882626.19043: Calling groups_inventory to load vars for managed_node1 30529 1726882626.19045: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882626.19054: Calling all_plugins_play to load vars for managed_node1 30529 1726882626.19057: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882626.19059: Calling groups_plugins_play to load vars for managed_node1 30529 1726882626.19607: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d22 30529 1726882626.19610: WORKER PROCESS EXITING 30529 1726882626.20922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882626.23965: done with get_vars() 30529 1726882626.24132: done getting variables 30529 1726882626.24247: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:06 -0400 (0:00:00.751) 0:00:40.268 ****** 30529 1726882626.24286: entering _queue_task() for managed_node1/service 30529 1726882626.24623: worker is 1 (out of 1 available) 30529 1726882626.24636: exiting _queue_task() for managed_node1/service 30529 1726882626.24651: done queuing things up, now waiting for results queue to drain 30529 1726882626.24655: waiting for pending results... 30529 1726882626.24983: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882626.25169: in run() - task 12673a56-9f93-b0f1-edc0-000000000d23 30529 1726882626.25197: variable 'ansible_search_path' from source: unknown 30529 1726882626.25211: variable 'ansible_search_path' from source: unknown 30529 1726882626.25262: calling self._execute() 30529 1726882626.25374: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.25386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.25408: variable 'omit' from source: magic vars 30529 1726882626.26035: variable 'ansible_distribution_major_version' from source: facts 30529 1726882626.26047: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882626.26286: variable 'network_provider' from source: set_fact 30529 1726882626.26289: Evaluated conditional (network_provider == "nm"): True 30529 1726882626.26459: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 
1726882626.26559: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882626.26747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882626.30381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882626.30385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882626.30388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882626.30395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882626.30415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882626.30646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882626.30674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882626.30702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882626.30740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882626.30753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882626.30831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882626.30918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882626.30999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882626.31002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882626.31005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882626.31122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882626.31126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882626.31128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882626.31178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882626.31197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882626.31377: variable 'network_connections' from source: include params 30529 1726882626.31392: variable 'interface' from source: play vars 30529 1726882626.31451: variable 'interface' from source: play vars 30529 1726882626.31520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882626.31800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882626.31804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882626.31806: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882626.31819: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882626.31862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882626.31884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882626.31931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882626.31941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882626.31992: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882626.32279: variable 'network_connections' from source: include params 30529 1726882626.32282: variable 'interface' from source: play vars 30529 1726882626.32373: variable 'interface' from source: play vars 30529 1726882626.32402: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882626.32406: when evaluation is False, skipping this task 30529 1726882626.32408: _execute() done 30529 1726882626.32411: dumping result to json 30529 1726882626.32413: done dumping result, returning 30529 1726882626.32426: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000000d23] 30529 1726882626.32441: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d23 30529 1726882626.32562: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d23 30529 1726882626.32566: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882626.32617: no more pending results, returning what we have 30529 1726882626.32620: results queue empty 30529 1726882626.32621: checking for any_errors_fatal 30529 1726882626.32649: done checking for any_errors_fatal 30529 1726882626.32650: checking for max_fail_percentage 30529 1726882626.32652: done checking for max_fail_percentage 30529 1726882626.32653: checking to see if all hosts have failed and the running result is not ok 30529 1726882626.32654: done checking to see if all hosts have failed 30529 1726882626.32655: getting the remaining hosts for this loop 30529 1726882626.32657: done getting the remaining hosts for this loop 30529 1726882626.32662: getting the next task 
for host managed_node1 30529 1726882626.32672: done getting next task for host managed_node1 30529 1726882626.32678: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882626.32684: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882626.32708: getting variables 30529 1726882626.32710: in VariableManager get_vars() 30529 1726882626.32745: Calling all_inventory to load vars for managed_node1 30529 1726882626.32747: Calling groups_inventory to load vars for managed_node1 30529 1726882626.32749: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882626.32758: Calling all_plugins_play to load vars for managed_node1 30529 1726882626.32761: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882626.32763: Calling groups_plugins_play to load vars for managed_node1 30529 1726882626.34441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882626.36139: done with get_vars() 30529 1726882626.36166: done getting variables 30529 1726882626.36246: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:06 -0400 (0:00:00.120) 0:00:40.389 ****** 30529 1726882626.36299: entering _queue_task() for managed_node1/service 30529 1726882626.36684: worker is 1 (out of 1 available) 30529 1726882626.36799: exiting _queue_task() for managed_node1/service 30529 1726882626.36816: done queuing things up, now waiting for results queue to drain 30529 1726882626.36819: waiting for pending results... 
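The wpa_supplicant task above was skipped because `__network_wpa_supplicant_required` evaluated to False (no IEEE 802.1X or wireless connections were defined). A hedged sketch of how such a conditionally skipped task is typically expressed — the variable name comes from the log, but the task body is an assumption:

```yaml
# Hypothetical sketch of a task gated on a role-default boolean; the
# "false_condition" field in the skip result names the failing conditional.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required
```

When the `when:` expression is False, the task executor short-circuits before connecting to the host, which is why the log shows "when evaluation is False, skipping this task" with no `_low_level_execute_command()` calls.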
30529 1726882626.37127: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882626.37257: in run() - task 12673a56-9f93-b0f1-edc0-000000000d24 30529 1726882626.37279: variable 'ansible_search_path' from source: unknown 30529 1726882626.37291: variable 'ansible_search_path' from source: unknown 30529 1726882626.37383: calling self._execute() 30529 1726882626.37477: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.37492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.37550: variable 'omit' from source: magic vars 30529 1726882626.38212: variable 'ansible_distribution_major_version' from source: facts 30529 1726882626.38224: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882626.38344: variable 'network_provider' from source: set_fact 30529 1726882626.38348: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882626.38350: when evaluation is False, skipping this task 30529 1726882626.38353: _execute() done 30529 1726882626.38398: dumping result to json 30529 1726882626.38402: done dumping result, returning 30529 1726882626.38405: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000000d24] 30529 1726882626.38407: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d24 30529 1726882626.38599: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d24 30529 1726882626.38602: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882626.38633: no more pending results, returning what we have 30529 1726882626.38636: results queue empty 30529 1726882626.38637: checking for any_errors_fatal 30529 1726882626.38643: done checking for 
any_errors_fatal 30529 1726882626.38643: checking for max_fail_percentage 30529 1726882626.38645: done checking for max_fail_percentage 30529 1726882626.38645: checking to see if all hosts have failed and the running result is not ok 30529 1726882626.38646: done checking to see if all hosts have failed 30529 1726882626.38647: getting the remaining hosts for this loop 30529 1726882626.38648: done getting the remaining hosts for this loop 30529 1726882626.38650: getting the next task for host managed_node1 30529 1726882626.38656: done getting next task for host managed_node1 30529 1726882626.38659: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882626.38664: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882626.38680: getting variables 30529 1726882626.38682: in VariableManager get_vars() 30529 1726882626.38713: Calling all_inventory to load vars for managed_node1 30529 1726882626.38716: Calling groups_inventory to load vars for managed_node1 30529 1726882626.38718: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882626.38726: Calling all_plugins_play to load vars for managed_node1 30529 1726882626.38729: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882626.38732: Calling groups_plugins_play to load vars for managed_node1 30529 1726882626.40206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882626.41742: done with get_vars() 30529 1726882626.41764: done getting variables 30529 1726882626.41824: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:06 -0400 (0:00:00.055) 0:00:40.444 ****** 30529 1726882626.41857: entering _queue_task() for managed_node1/copy 30529 1726882626.42214: worker is 1 (out of 1 available) 30529 1726882626.42227: exiting _queue_task() for managed_node1/copy 30529 1726882626.42239: done queuing things up, now waiting for results queue to drain 30529 1726882626.42241: waiting for pending results... 
30529 1726882626.42544: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882626.42718: in run() - task 12673a56-9f93-b0f1-edc0-000000000d25 30529 1726882626.42722: variable 'ansible_search_path' from source: unknown 30529 1726882626.42800: variable 'ansible_search_path' from source: unknown 30529 1726882626.42805: calling self._execute() 30529 1726882626.42881: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.42898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.42914: variable 'omit' from source: magic vars 30529 1726882626.43308: variable 'ansible_distribution_major_version' from source: facts 30529 1726882626.43326: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882626.43452: variable 'network_provider' from source: set_fact 30529 1726882626.43462: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882626.43469: when evaluation is False, skipping this task 30529 1726882626.43480: _execute() done 30529 1726882626.43488: dumping result to json 30529 1726882626.43499: done dumping result, returning 30529 1726882626.43510: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000000d25] 30529 1726882626.43518: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d25 30529 1726882626.43658: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d25 30529 1726882626.43661: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882626.43740: no more pending results, returning what we have 30529 1726882626.43744: results queue empty 30529 1726882626.43745: checking for 
any_errors_fatal 30529 1726882626.43751: done checking for any_errors_fatal 30529 1726882626.43752: checking for max_fail_percentage 30529 1726882626.43754: done checking for max_fail_percentage 30529 1726882626.43755: checking to see if all hosts have failed and the running result is not ok 30529 1726882626.43756: done checking to see if all hosts have failed 30529 1726882626.43756: getting the remaining hosts for this loop 30529 1726882626.43758: done getting the remaining hosts for this loop 30529 1726882626.43762: getting the next task for host managed_node1 30529 1726882626.43771: done getting next task for host managed_node1 30529 1726882626.43774: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882626.43779: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882626.43805: getting variables 30529 1726882626.43807: in VariableManager get_vars() 30529 1726882626.43843: Calling all_inventory to load vars for managed_node1 30529 1726882626.43845: Calling groups_inventory to load vars for managed_node1 30529 1726882626.43848: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882626.43879: Calling all_plugins_play to load vars for managed_node1 30529 1726882626.43882: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882626.43885: Calling groups_plugins_play to load vars for managed_node1 30529 1726882626.45432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882626.46984: done with get_vars() 30529 1726882626.47008: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:06 -0400 (0:00:00.052) 0:00:40.496 ****** 30529 1726882626.47083: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882626.47367: worker is 1 (out of 1 available) 30529 1726882626.47379: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882626.47595: done queuing things up, now waiting for results queue to drain 30529 1726882626.47598: waiting for pending results... 
30529 1726882626.47725: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882626.47901: in run() - task 12673a56-9f93-b0f1-edc0-000000000d26 30529 1726882626.47906: variable 'ansible_search_path' from source: unknown 30529 1726882626.47909: variable 'ansible_search_path' from source: unknown 30529 1726882626.47911: calling self._execute() 30529 1726882626.47971: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.47975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.47981: variable 'omit' from source: magic vars 30529 1726882626.48367: variable 'ansible_distribution_major_version' from source: facts 30529 1726882626.48373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882626.48380: variable 'omit' from source: magic vars 30529 1726882626.48449: variable 'omit' from source: magic vars 30529 1726882626.48733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882626.51038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882626.51109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882626.51147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882626.51181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882626.51217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882626.51302: variable 'network_provider' from source: set_fact 30529 1726882626.51495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882626.51499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882626.51502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882626.51534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882626.51549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882626.51623: variable 'omit' from source: magic vars 30529 1726882626.51733: variable 'omit' from source: magic vars 30529 1726882626.51841: variable 'network_connections' from source: include params 30529 1726882626.51857: variable 'interface' from source: play vars 30529 1726882626.52036: variable 'interface' from source: play vars 30529 1726882626.52069: variable 'omit' from source: magic vars 30529 1726882626.52077: variable '__lsr_ansible_managed' from source: task vars 30529 1726882626.52145: variable '__lsr_ansible_managed' from source: task vars 30529 1726882626.52327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882626.52547: Loaded config def from plugin (lookup/template) 30529 1726882626.52551: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882626.52580: File lookup term: get_ansible_managed.j2 30529 1726882626.52584: variable 
'ansible_search_path' from source: unknown 30529 1726882626.52597: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882626.52606: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882626.52622: variable 'ansible_search_path' from source: unknown 30529 1726882626.59369: variable 'ansible_managed' from source: unknown 30529 1726882626.59523: variable 'omit' from source: magic vars 30529 1726882626.59555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882626.59585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882626.59620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882626.59645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882626.59699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882626.59703: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882626.59706: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.59712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.59819: Set connection var ansible_shell_executable to /bin/sh 30529 1726882626.59838: Set connection var ansible_pipelining to False 30529 1726882626.59848: Set connection var ansible_shell_type to sh 30529 1726882626.59899: Set connection var ansible_timeout to 10 30529 1726882626.59902: Set connection var ansible_connection to ssh 30529 1726882626.59904: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882626.59908: variable 'ansible_shell_executable' from source: unknown 30529 1726882626.59917: variable 'ansible_connection' from source: unknown 30529 1726882626.59924: variable 'ansible_module_compression' from source: unknown 30529 1726882626.59931: variable 'ansible_shell_type' from source: unknown 30529 1726882626.59938: variable 'ansible_shell_executable' from source: unknown 30529 1726882626.59955: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882626.60058: variable 'ansible_pipelining' from source: unknown 30529 1726882626.60062: variable 'ansible_timeout' from source: unknown 30529 1726882626.60066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882626.60124: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882626.60148: variable 'omit' from 
source: magic vars 30529 1726882626.60167: starting attempt loop 30529 1726882626.60175: running the handler 30529 1726882626.60194: _low_level_execute_command(): starting 30529 1726882626.60206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882626.60899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882626.60948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.60963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882626.61014: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.61076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882626.61109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.61126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.61280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.62985: stdout chunk (state=3): >>>/root <<< 30529 1726882626.63052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 
1726882626.63182: stderr chunk (state=3): >>><<< 30529 1726882626.63191: stdout chunk (state=3): >>><<< 30529 1726882626.63220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882626.63310: _low_level_execute_command(): starting 30529 1726882626.63313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296 `" && echo ansible-tmp-1726882626.632278-32428-47206128417296="` echo /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296 `" ) && sleep 0' 30529 1726882626.63815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882626.63831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882626.63846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882626.63865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882626.63882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882626.63896: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882626.63912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.64003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882626.64026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.64042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.64134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.66004: stdout chunk (state=3): >>>ansible-tmp-1726882626.632278-32428-47206128417296=/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296 <<< 30529 1726882626.66136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882626.66156: stdout chunk (state=3): >>><<< 30529 1726882626.66168: stderr chunk (state=3): >>><<< 30529 1726882626.66190: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882626.632278-32428-47206128417296=/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882626.66244: variable 'ansible_module_compression' from source: unknown 30529 1726882626.66361: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882626.66365: variable 'ansible_facts' from source: unknown 30529 1726882626.66512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py 30529 1726882626.66720: Sending initial data 30529 1726882626.66723: Sent initial data (166 bytes) 30529 1726882626.67284: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30529 1726882626.67301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882626.67405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.67434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882626.67448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.67466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.67538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.69068: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882626.69119: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882626.69178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3n9x_e1y /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py <<< 30529 1726882626.69182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py" <<< 30529 1726882626.69230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3n9x_e1y" to remote "/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py" <<< 30529 1726882626.70298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882626.70353: stderr chunk (state=3): >>><<< 30529 1726882626.70367: stdout chunk (state=3): >>><<< 30529 1726882626.70469: done transferring module to remote 30529 1726882626.70472: _low_level_execute_command(): starting 30529 1726882626.70475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/ /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py && sleep 0' 30529 1726882626.71098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882626.71133: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882626.71207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882626.71247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882626.71263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.71281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.71363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.73109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882626.73120: stdout chunk (state=3): >>><<< 30529 1726882626.73136: stderr chunk (state=3): >>><<< 30529 1726882626.73227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882626.73230: _low_level_execute_command(): starting 30529 1726882626.73232: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/AnsiballZ_network_connections.py && sleep 0' 30529 1726882626.73753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882626.73769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882626.73784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882626.73808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882626.73905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882626.73933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882626.74011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882626.99001: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882627.00325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882627.00338: stdout chunk (state=3): >>><<< 30529 1726882627.00655: stderr chunk (state=3): >>><<< 30529 1726882627.00659: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882627.00662: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882627.00665: _low_level_execute_command(): starting 30529 1726882627.00667: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882626.632278-32428-47206128417296/ > /dev/null 2>&1 && sleep 0' 30529 1726882627.01607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882627.01901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882627.01926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.01944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.02015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.03862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.03872: stdout chunk (state=3): >>><<< 30529 1726882627.03883: stderr chunk (state=3): >>><<< 30529 1726882627.03906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882627.03918: handler run complete 30529 1726882627.03945: attempt loop complete, returning result 30529 1726882627.03953: _execute() done 30529 1726882627.03960: dumping result to json 30529 1726882627.03969: done dumping result, returning 30529 1726882627.03983: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000000d26] 30529 1726882627.03991: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d26 ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active 30529 1726882627.04388: no more pending results, returning what we have 30529 1726882627.04398: results queue empty 30529 1726882627.04400: checking for any_errors_fatal 30529 1726882627.04404: done checking for any_errors_fatal 30529 1726882627.04405: checking for max_fail_percentage 30529 1726882627.04407: done checking for max_fail_percentage 30529 1726882627.04408: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.04409: done checking to see if all hosts have failed 30529 1726882627.04409: getting the remaining hosts for this loop 30529 1726882627.04411: done getting the remaining hosts for this loop 30529 1726882627.04414: getting the next task for host managed_node1 30529 1726882627.04423: done getting next task for host managed_node1 30529 1726882627.04426: ^ task is: TASK: fedora.linux_system_roles.network : 
Configure networking state 30529 1726882627.04430: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882627.04445: getting variables 30529 1726882627.04447: in VariableManager get_vars() 30529 1726882627.04481: Calling all_inventory to load vars for managed_node1 30529 1726882627.04484: Calling groups_inventory to load vars for managed_node1 30529 1726882627.04486: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.04704: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.04708: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.04712: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.05352: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d26 30529 1726882627.05358: WORKER PROCESS EXITING 30529 1726882627.07075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.08696: done with get_vars() 30529 1726882627.08721: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:07 -0400 (0:00:00.617) 0:00:41.114 ****** 30529 1726882627.08815: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882627.09182: worker is 1 (out of 1 available) 30529 1726882627.09397: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882627.09407: done queuing things up, now waiting for results queue to drain 30529 1726882627.09409: waiting for pending results... 
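The module invocation traced above (provider `nm`, a single `statebr` connection with `state: up`) is what the role generates from its `network_connections` variable, and the "skipped because already active" stderr line is the module's idempotence check firing. A play that would drive an equivalent invocation might look like the following sketch; the host name, connection name, and state come from the log, while the overall play layout is an illustrative assumption:

```yaml
# Hypothetical sketch reproducing the invocation seen in the log above.
# The role translates `network_connections` into a call to the
# fedora.linux_system_roles.network_connections module with provider "nm".
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr   # connection name taken from the log
            state: up       # bringing up an already-active profile is a no-op
```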
30529 1726882627.09572: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882627.09809: in run() - task 12673a56-9f93-b0f1-edc0-000000000d27 30529 1726882627.10099: variable 'ansible_search_path' from source: unknown 30529 1726882627.10103: variable 'ansible_search_path' from source: unknown 30529 1726882627.10106: calling self._execute() 30529 1726882627.10205: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.10217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.10233: variable 'omit' from source: magic vars 30529 1726882627.10792: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.10813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.10998: variable 'network_state' from source: role '' defaults 30529 1726882627.11015: Evaluated conditional (network_state != {}): False 30529 1726882627.11028: when evaluation is False, skipping this task 30529 1726882627.11070: _execute() done 30529 1726882627.11079: dumping result to json 30529 1726882627.11086: done dumping result, returning 30529 1726882627.11122: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000000d27] 30529 1726882627.11363: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d27 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882627.11496: no more pending results, returning what we have 30529 1726882627.11501: results queue empty 30529 1726882627.11502: checking for any_errors_fatal 30529 1726882627.11518: done checking for any_errors_fatal 30529 1726882627.11519: checking for max_fail_percentage 30529 1726882627.11521: done checking for max_fail_percentage 30529 1726882627.11522: 
checking to see if all hosts have failed and the running result is not ok 30529 1726882627.11523: done checking to see if all hosts have failed 30529 1726882627.11524: getting the remaining hosts for this loop 30529 1726882627.11525: done getting the remaining hosts for this loop 30529 1726882627.11530: getting the next task for host managed_node1 30529 1726882627.11539: done getting next task for host managed_node1 30529 1726882627.11544: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882627.11550: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882627.11574: getting variables 30529 1726882627.11576: in VariableManager get_vars() 30529 1726882627.11721: Calling all_inventory to load vars for managed_node1 30529 1726882627.11723: Calling groups_inventory to load vars for managed_node1 30529 1726882627.11726: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.11733: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d27 30529 1726882627.11736: WORKER PROCESS EXITING 30529 1726882627.11748: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.11751: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.11754: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.13191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.15053: done with get_vars() 30529 1726882627.15080: done getting variables 30529 1726882627.15143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:07 -0400 (0:00:00.063) 0:00:41.177 ****** 30529 1726882627.15186: entering _queue_task() for managed_node1/debug 30529 1726882627.15800: worker is 1 (out of 1 available) 30529 1726882627.15916: exiting _queue_task() for managed_node1/debug 30529 1726882627.15929: done queuing things up, now waiting for results queue to drain 30529 1726882627.15931: waiting for pending results... 
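The "Configure networking state" task above is skipped because `network_state` still holds its role default of `{}`, so the task's `network_state != {}` conditional evaluates False. To exercise that branch, the role would need a non-empty nmstate-style dictionary; the shape below is an illustrative assumption and is not taken from this run:

```yaml
# Hypothetical example only: any non-empty dict makes the
# `network_state != {}` conditional evaluate True, so the
# "Configure networking state" task would run instead of being skipped.
network_state:
  interfaces:
    - name: statebr
      type: linux-bridge
      state: up
```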
30529 1726882627.16447: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882627.16595: in run() - task 12673a56-9f93-b0f1-edc0-000000000d28 30529 1726882627.16627: variable 'ansible_search_path' from source: unknown 30529 1726882627.16634: variable 'ansible_search_path' from source: unknown 30529 1726882627.16671: calling self._execute() 30529 1726882627.16760: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.16770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.16782: variable 'omit' from source: magic vars 30529 1726882627.17136: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.17153: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.17165: variable 'omit' from source: magic vars 30529 1726882627.17233: variable 'omit' from source: magic vars 30529 1726882627.17272: variable 'omit' from source: magic vars 30529 1726882627.17320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882627.17360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882627.17388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882627.17597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.17601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.17603: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882627.17606: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.17608: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882627.17610: Set connection var ansible_shell_executable to /bin/sh 30529 1726882627.17612: Set connection var ansible_pipelining to False 30529 1726882627.17614: Set connection var ansible_shell_type to sh 30529 1726882627.17617: Set connection var ansible_timeout to 10 30529 1726882627.17619: Set connection var ansible_connection to ssh 30529 1726882627.17621: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882627.17635: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.17642: variable 'ansible_connection' from source: unknown 30529 1726882627.17650: variable 'ansible_module_compression' from source: unknown 30529 1726882627.17656: variable 'ansible_shell_type' from source: unknown 30529 1726882627.17663: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.17670: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.17680: variable 'ansible_pipelining' from source: unknown 30529 1726882627.17687: variable 'ansible_timeout' from source: unknown 30529 1726882627.17696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.17831: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882627.17849: variable 'omit' from source: magic vars 30529 1726882627.17860: starting attempt loop 30529 1726882627.17866: running the handler 30529 1726882627.17988: variable '__network_connections_result' from source: set_fact 30529 1726882627.18045: handler run complete 30529 1726882627.18067: attempt loop complete, returning result 30529 1726882627.18073: _execute() done 30529 1726882627.18080: dumping result to json 30529 1726882627.18087: 
done dumping result, returning 30529 1726882627.18102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000d28] 30529 1726882627.18111: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d28 30529 1726882627.18213: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d28 30529 1726882627.18298: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active" ] } 30529 1726882627.18568: no more pending results, returning what we have 30529 1726882627.18571: results queue empty 30529 1726882627.18572: checking for any_errors_fatal 30529 1726882627.18576: done checking for any_errors_fatal 30529 1726882627.18577: checking for max_fail_percentage 30529 1726882627.18578: done checking for max_fail_percentage 30529 1726882627.18579: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.18580: done checking to see if all hosts have failed 30529 1726882627.18580: getting the remaining hosts for this loop 30529 1726882627.18582: done getting the remaining hosts for this loop 30529 1726882627.18585: getting the next task for host managed_node1 30529 1726882627.18596: done getting next task for host managed_node1 30529 1726882627.18600: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882627.18605: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882627.18617: getting variables 30529 1726882627.18618: in VariableManager get_vars() 30529 1726882627.18664: Calling all_inventory to load vars for managed_node1 30529 1726882627.18667: Calling groups_inventory to load vars for managed_node1 30529 1726882627.18669: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.18678: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.18680: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.18683: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.20596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.22586: done with get_vars() 30529 1726882627.22630: done getting variables 30529 1726882627.22724: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:07 -0400 (0:00:00.076) 0:00:41.254 ****** 30529 1726882627.22817: entering _queue_task() for managed_node1/debug 30529 1726882627.23335: worker is 1 (out of 1 available) 30529 1726882627.23349: exiting _queue_task() for managed_node1/debug 30529 1726882627.23366: done queuing things up, now waiting for results queue to drain 30529 1726882627.23368: waiting for pending results... 30529 1726882627.23705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882627.23935: in run() - task 12673a56-9f93-b0f1-edc0-000000000d29 30529 1726882627.23939: variable 'ansible_search_path' from source: unknown 30529 1726882627.23942: variable 'ansible_search_path' from source: unknown 30529 1726882627.23945: calling self._execute() 30529 1726882627.23997: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.24000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.24011: variable 'omit' from source: magic vars 30529 1726882627.24397: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.24414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.24421: variable 'omit' from source: magic vars 30529 1726882627.24488: variable 'omit' from source: magic vars 30529 1726882627.24528: variable 'omit' from source: magic vars 30529 1726882627.24565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882627.24600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882627.24618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882627.24640: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.24654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.24699: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882627.24702: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.24705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.24807: Set connection var ansible_shell_executable to /bin/sh 30529 1726882627.24811: Set connection var ansible_pipelining to False 30529 1726882627.24813: Set connection var ansible_shell_type to sh 30529 1726882627.24815: Set connection var ansible_timeout to 10 30529 1726882627.24818: Set connection var ansible_connection to ssh 30529 1726882627.24820: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882627.24915: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.24919: variable 'ansible_connection' from source: unknown 30529 1726882627.24921: variable 'ansible_module_compression' from source: unknown 30529 1726882627.24923: variable 'ansible_shell_type' from source: unknown 30529 1726882627.24925: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.24927: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.24929: variable 'ansible_pipelining' from source: unknown 30529 1726882627.24931: variable 'ansible_timeout' from source: unknown 30529 1726882627.24933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.24988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882627.24999: variable 'omit' from source: magic vars 30529 1726882627.25005: starting attempt loop 30529 1726882627.25008: running the handler 30529 1726882627.25059: variable '__network_connections_result' from source: set_fact 30529 1726882627.25143: variable '__network_connections_result' from source: set_fact 30529 1726882627.25250: handler run complete 30529 1726882627.25274: attempt loop complete, returning result 30529 1726882627.25285: _execute() done 30529 1726882627.25288: dumping result to json 30529 1726882627.25295: done dumping result, returning 30529 1726882627.25350: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000000d29] 30529 1726882627.25353: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d29 30529 1726882627.25420: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d29 30529 1726882627.25423: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6645673c-872c-4c3e-a9a0-f259b2189616 skipped because already active" ] } } 30529 1726882627.25517: no more pending results, returning what we have 30529 1726882627.25521: results queue empty 30529 
1726882627.25522: checking for any_errors_fatal 30529 1726882627.25530: done checking for any_errors_fatal 30529 1726882627.25531: checking for max_fail_percentage 30529 1726882627.25533: done checking for max_fail_percentage 30529 1726882627.25534: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.25535: done checking to see if all hosts have failed 30529 1726882627.25536: getting the remaining hosts for this loop 30529 1726882627.25538: done getting the remaining hosts for this loop 30529 1726882627.25542: getting the next task for host managed_node1 30529 1726882627.25551: done getting next task for host managed_node1 30529 1726882627.25555: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882627.25560: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882627.25571: getting variables 30529 1726882627.25573: in VariableManager get_vars() 30529 1726882627.25715: Calling all_inventory to load vars for managed_node1 30529 1726882627.25718: Calling groups_inventory to load vars for managed_node1 30529 1726882627.25727: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.25738: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.25741: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.25744: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.27675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.29436: done with get_vars() 30529 1726882627.29460: done getting variables 30529 1726882627.29528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:07 -0400 (0:00:00.067) 0:00:41.321 ****** 30529 1726882627.29569: entering _queue_task() for managed_node1/debug 30529 1726882627.30124: worker is 1 (out of 1 available) 30529 1726882627.30136: exiting _queue_task() for managed_node1/debug 30529 1726882627.30149: done queuing things up, now waiting for results queue to drain 30529 1726882627.30153: waiting for pending results... 
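The debug result for the network_connections task above carries both a raw "stderr" string and a derived "stderr_lines" list. As a rough sketch of that relationship (inferred from the observed pair, not the actual ansible-core source), the *_lines field is just the raw text split on newlines, which drops the trailing newline:

```python
def to_lines(text):
    # Hypothetical helper mirroring the stderr/stderr_lines pair seen in the
    # task result above: the *_lines form is the raw text split into lines.
    return text.splitlines()

stderr = ("[002] #0, state:up persistent_state:present, 'statebr': "
          "up connection statebr skipped because already active\n")
print(to_lines(stderr))  # one-element list, matching the shape of stderr_lines
```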
30529 1726882627.30477: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882627.30482: in run() - task 12673a56-9f93-b0f1-edc0-000000000d2a 30529 1726882627.30485: variable 'ansible_search_path' from source: unknown 30529 1726882627.30490: variable 'ansible_search_path' from source: unknown 30529 1726882627.30511: calling self._execute() 30529 1726882627.30616: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.30620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.30632: variable 'omit' from source: magic vars 30529 1726882627.31063: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.31074: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.31224: variable 'network_state' from source: role '' defaults 30529 1726882627.31227: Evaluated conditional (network_state != {}): False 30529 1726882627.31230: when evaluation is False, skipping this task 30529 1726882627.31233: _execute() done 30529 1726882627.31236: dumping result to json 30529 1726882627.31238: done dumping result, returning 30529 1726882627.31323: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000000d2a] 30529 1726882627.31326: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d2a 30529 1726882627.31390: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d2a 30529 1726882627.31396: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882627.31442: no more pending results, returning what we have 30529 1726882627.31446: results queue empty 30529 1726882627.31447: checking for any_errors_fatal 30529 1726882627.31457: done checking for any_errors_fatal 30529 1726882627.31457: checking for 
max_fail_percentage 30529 1726882627.31459: done checking for max_fail_percentage 30529 1726882627.31460: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.31461: done checking to see if all hosts have failed 30529 1726882627.31462: getting the remaining hosts for this loop 30529 1726882627.31463: done getting the remaining hosts for this loop 30529 1726882627.31469: getting the next task for host managed_node1 30529 1726882627.31477: done getting next task for host managed_node1 30529 1726882627.31481: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882627.31487: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882627.31512: getting variables 30529 1726882627.31514: in VariableManager get_vars() 30529 1726882627.31548: Calling all_inventory to load vars for managed_node1 30529 1726882627.31550: Calling groups_inventory to load vars for managed_node1 30529 1726882627.31552: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.31564: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.31567: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.31570: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.33215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.34894: done with get_vars() 30529 1726882627.34914: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:07 -0400 (0:00:00.054) 0:00:41.376 ****** 30529 1726882627.35008: entering _queue_task() for managed_node1/ping 30529 1726882627.35602: worker is 1 (out of 1 available) 30529 1726882627.35615: exiting _queue_task() for managed_node1/ping 30529 1726882627.35633: done queuing things up, now waiting for results queue to drain 30529 1726882627.35635: waiting for pending results... 
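Just above, the "Show debug messages for the network_state" task was skipped because the conditional logged as "Evaluated conditional (network_state != {}): False" fired against the role default of an empty dict. A minimal sketch of that guard (hypothetical function name, not the role's own code):

```python
def should_show_network_state(network_state):
    # Mirrors the "when" guard from the log:
    #   Evaluated conditional (network_state != {}): False
    # The role default for network_state is {}, so the task is skipped
    # unless the caller supplies a non-empty state.
    return network_state != {}

print(should_show_network_state({}))                  # role default: skipped
print(should_show_network_state({"interfaces": []}))  # user-supplied: runs
```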
30529 1726882627.35771: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882627.35936: in run() - task 12673a56-9f93-b0f1-edc0-000000000d2b 30529 1726882627.35948: variable 'ansible_search_path' from source: unknown 30529 1726882627.35951: variable 'ansible_search_path' from source: unknown 30529 1726882627.35991: calling self._execute() 30529 1726882627.36084: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.36088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.36098: variable 'omit' from source: magic vars 30529 1726882627.36457: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.36469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.36476: variable 'omit' from source: magic vars 30529 1726882627.36547: variable 'omit' from source: magic vars 30529 1726882627.36579: variable 'omit' from source: magic vars 30529 1726882627.36618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882627.36651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882627.36670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882627.36688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.36705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882627.36747: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882627.36750: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.36752: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882627.37198: Set connection var ansible_shell_executable to /bin/sh 30529 1726882627.37201: Set connection var ansible_pipelining to False 30529 1726882627.37203: Set connection var ansible_shell_type to sh 30529 1726882627.37205: Set connection var ansible_timeout to 10 30529 1726882627.37207: Set connection var ansible_connection to ssh 30529 1726882627.37209: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882627.37211: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.37213: variable 'ansible_connection' from source: unknown 30529 1726882627.37216: variable 'ansible_module_compression' from source: unknown 30529 1726882627.37218: variable 'ansible_shell_type' from source: unknown 30529 1726882627.37220: variable 'ansible_shell_executable' from source: unknown 30529 1726882627.37222: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.37224: variable 'ansible_pipelining' from source: unknown 30529 1726882627.37226: variable 'ansible_timeout' from source: unknown 30529 1726882627.37228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.37502: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882627.37507: variable 'omit' from source: magic vars 30529 1726882627.37509: starting attempt loop 30529 1726882627.37511: running the handler 30529 1726882627.37514: _low_level_execute_command(): starting 30529 1726882627.37516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882627.38413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.38504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.38518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.38600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.40342: stdout chunk (state=3): >>>/root <<< 30529 1726882627.40425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.40428: stdout chunk (state=3): >>><<< 30529 1726882627.40430: stderr chunk (state=3): >>><<< 30529 1726882627.40436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882627.40452: _low_level_execute_command(): starting 30529 1726882627.40609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654 `" && echo ansible-tmp-1726882627.4044132-32470-79286438720654="` echo /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654 `" ) && sleep 0' 30529 1726882627.41714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882627.41730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882627.41780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882627.41951: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882627.41973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.42019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.42124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.44008: stdout chunk (state=3): >>>ansible-tmp-1726882627.4044132-32470-79286438720654=/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654 <<< 30529 1726882627.44238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.44242: stdout chunk (state=3): >>><<< 30529 1726882627.44244: stderr chunk (state=3): >>><<< 30529 1726882627.44489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882627.4044132-32470-79286438720654=/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882627.44494: variable 'ansible_module_compression' from source: unknown 30529 1726882627.44497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882627.44619: variable 'ansible_facts' from source: unknown 30529 1726882627.44713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py 30529 1726882627.44859: Sending initial data 30529 1726882627.44869: Sent initial data (152 bytes) 30529 1726882627.45653: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.45718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 30529 1726882627.45741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.45777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.45815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.47333: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882627.47360: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882627.47372: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882627.47388: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882627.47476: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882627.47575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0i4ea_m8 /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py <<< 30529 1726882627.47589: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py" <<< 30529 1726882627.47631: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0i4ea_m8" to remote "/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py" <<< 30529 1726882627.48843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.48931: stderr chunk (state=3): >>><<< 30529 1726882627.48942: stdout chunk (state=3): >>><<< 30529 1726882627.49006: done transferring module to remote 30529 1726882627.49021: _low_level_execute_command(): starting 30529 1726882627.49074: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/ /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py && sleep 0' 30529 1726882627.50269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882627.50279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882627.50311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882627.50326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882627.50339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882627.50362: 
stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882627.50372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.50387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882627.50525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.50627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882627.50638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.50659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.50744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.52529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.52533: stdout chunk (state=3): >>><<< 30529 1726882627.52536: stderr chunk (state=3): >>><<< 30529 1726882627.52561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882627.52565: _low_level_execute_command(): starting 30529 1726882627.52567: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/AnsiballZ_ping.py && sleep 0' 30529 1726882627.53777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.53816: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882627.53842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.53953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.68827: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882627.69881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882627.69973: stderr chunk (state=3): >>><<< 30529 1726882627.70238: stdout chunk (state=3): >>><<< 30529 1726882627.70243: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.9.159 closed. 30529 1726882627.70247: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882627.70250: _low_level_execute_command(): starting 30529 1726882627.70252: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882627.4044132-32470-79286438720654/ > /dev/null 2>&1 && sleep 0' 30529 1726882627.71399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882627.71403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882627.71431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.71434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882627.71436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882627.71438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882627.71723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882627.71750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882627.71783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882627.73623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882627.73627: stdout chunk (state=3): >>><<< 30529 1726882627.73629: stderr chunk (state=3): >>><<< 30529 1726882627.73645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882627.73663: handler run complete 30529 1726882627.73685: attempt loop complete, returning result 30529 1726882627.73692: _execute() done 30529 1726882627.73703: dumping result to json 30529 1726882627.73711: done dumping result, returning 30529 1726882627.73801: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000000d2b] 30529 1726882627.73805: sending task result for task 12673a56-9f93-b0f1-edc0-000000000d2b 30529 1726882627.73873: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000d2b 30529 1726882627.73876: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882627.73947: no more pending results, returning what we have 30529 1726882627.73951: results queue empty 30529 1726882627.73952: checking for any_errors_fatal 30529 1726882627.73960: done checking for any_errors_fatal 30529 1726882627.73961: checking for max_fail_percentage 30529 1726882627.73962: done checking for max_fail_percentage 30529 1726882627.73963: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.73964: done checking to see if all hosts have failed 30529 1726882627.73965: getting the remaining hosts for this loop 30529 1726882627.73967: done getting the remaining hosts for this loop 30529 1726882627.73970: getting the next task for host managed_node1 30529 1726882627.73982: done getting next task for host managed_node1 30529 1726882627.73985: ^ task is: TASK: meta (role_complete) 30529 1726882627.73994: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882627.74005: getting variables 30529 1726882627.74007: in VariableManager get_vars() 30529 1726882627.74045: Calling all_inventory to load vars for managed_node1 30529 1726882627.74048: Calling groups_inventory to load vars for managed_node1 30529 1726882627.74050: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.74060: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.74062: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.74065: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.81160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.82822: done with get_vars() 30529 1726882627.82846: done getting variables 30529 1726882627.82917: done queuing things up, now waiting for results queue to drain 30529 1726882627.82920: results queue empty 30529 1726882627.82921: checking for any_errors_fatal 30529 1726882627.82923: done checking for 
any_errors_fatal 30529 1726882627.82924: checking for max_fail_percentage 30529 1726882627.82925: done checking for max_fail_percentage 30529 1726882627.82926: checking to see if all hosts have failed and the running result is not ok 30529 1726882627.82927: done checking to see if all hosts have failed 30529 1726882627.82927: getting the remaining hosts for this loop 30529 1726882627.82928: done getting the remaining hosts for this loop 30529 1726882627.82931: getting the next task for host managed_node1 30529 1726882627.82936: done getting next task for host managed_node1 30529 1726882627.82938: ^ task is: TASK: Asserts 30529 1726882627.82940: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882627.82943: getting variables 30529 1726882627.82944: in VariableManager get_vars() 30529 1726882627.82954: Calling all_inventory to load vars for managed_node1 30529 1726882627.82956: Calling groups_inventory to load vars for managed_node1 30529 1726882627.82959: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.82964: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.82966: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.82969: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.85188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882627.88713: done with get_vars() 30529 1726882627.88734: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:37:07 -0400 (0:00:00.538) 0:00:41.915 ****** 30529 1726882627.88906: entering _queue_task() for managed_node1/include_tasks 30529 1726882627.89599: worker is 1 (out of 1 available) 30529 1726882627.89613: exiting _queue_task() for managed_node1/include_tasks 30529 1726882627.89629: done queuing things up, now waiting for results queue to drain 30529 1726882627.89631: waiting for pending results... 
30529 1726882627.90213: running TaskExecutor() for managed_node1/TASK: Asserts 30529 1726882627.90675: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4e 30529 1726882627.90688: variable 'ansible_search_path' from source: unknown 30529 1726882627.90691: variable 'ansible_search_path' from source: unknown 30529 1726882627.90745: variable 'lsr_assert' from source: include params 30529 1726882627.91268: variable 'lsr_assert' from source: include params 30529 1726882627.91499: variable 'omit' from source: magic vars 30529 1726882627.91800: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.91804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.92009: variable 'omit' from source: magic vars 30529 1726882627.92644: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.92653: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.92661: variable 'item' from source: unknown 30529 1726882627.92733: variable 'item' from source: unknown 30529 1726882627.92768: variable 'item' from source: unknown 30529 1726882627.93117: variable 'item' from source: unknown 30529 1726882627.93511: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882627.93515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882627.93517: variable 'omit' from source: magic vars 30529 1726882627.93567: variable 'ansible_distribution_major_version' from source: facts 30529 1726882627.93570: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882627.93577: variable 'item' from source: unknown 30529 1726882627.93639: variable 'item' from source: unknown 30529 1726882627.93730: variable 'item' from source: unknown 30529 1726882627.93930: variable 'item' from source: unknown 30529 1726882627.93999: dumping result to json 30529 1726882627.94003: done dumping result, returning 30529 
1726882627.94005: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-000000000a4e] 30529 1726882627.94007: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4e 30529 1726882627.94042: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4e 30529 1726882627.94044: WORKER PROCESS EXITING 30529 1726882627.94071: no more pending results, returning what we have 30529 1726882627.94076: in VariableManager get_vars() 30529 1726882627.94115: Calling all_inventory to load vars for managed_node1 30529 1726882627.94118: Calling groups_inventory to load vars for managed_node1 30529 1726882627.94122: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882627.94136: Calling all_plugins_play to load vars for managed_node1 30529 1726882627.94140: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882627.94143: Calling groups_plugins_play to load vars for managed_node1 30529 1726882627.97029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.00279: done with get_vars() 30529 1726882628.00363: variable 'ansible_search_path' from source: unknown 30529 1726882628.00364: variable 'ansible_search_path' from source: unknown 30529 1726882628.00476: variable 'ansible_search_path' from source: unknown 30529 1726882628.00478: variable 'ansible_search_path' from source: unknown 30529 1726882628.00508: we have included files to process 30529 1726882628.00509: generating all_blocks data 30529 1726882628.00512: done generating all_blocks data 30529 1726882628.00517: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882628.00518: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882628.00520: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882628.00843: in VariableManager get_vars() 30529 1726882628.00862: done with get_vars() 30529 1726882628.01088: done processing included file 30529 1726882628.01090: iterating over new_blocks loaded from include file 30529 1726882628.01092: in VariableManager get_vars() 30529 1726882628.01108: done with get_vars() 30529 1726882628.01109: filtering new block on tags 30529 1726882628.01288: done filtering new block on tags 30529 1726882628.01291: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 => (item=tasks/assert_device_present.yml) 30529 1726882628.01298: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882628.01299: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882628.01302: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30529 1726882628.01620: in VariableManager get_vars() 30529 1726882628.01641: done with get_vars() 30529 1726882628.02116: done processing included file 30529 1726882628.02118: iterating over new_blocks loaded from include file 30529 1726882628.02119: in VariableManager get_vars() 30529 1726882628.02510: done with get_vars() 30529 1726882628.02512: filtering new block on tags 30529 1726882628.02561: done filtering new block on tags 30529 1726882628.02563: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for 
managed_node1 => (item=tasks/assert_profile_present.yml) 30529 1726882628.02567: extending task lists for all hosts with included blocks 30529 1726882628.04834: done extending task lists 30529 1726882628.04835: done processing included files 30529 1726882628.04836: results queue empty 30529 1726882628.04837: checking for any_errors_fatal 30529 1726882628.04838: done checking for any_errors_fatal 30529 1726882628.04839: checking for max_fail_percentage 30529 1726882628.04840: done checking for max_fail_percentage 30529 1726882628.04841: checking to see if all hosts have failed and the running result is not ok 30529 1726882628.04842: done checking to see if all hosts have failed 30529 1726882628.04843: getting the remaining hosts for this loop 30529 1726882628.04844: done getting the remaining hosts for this loop 30529 1726882628.04847: getting the next task for host managed_node1 30529 1726882628.04852: done getting next task for host managed_node1 30529 1726882628.04854: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882628.04857: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882628.04865: getting variables 30529 1726882628.04867: in VariableManager get_vars() 30529 1726882628.04877: Calling all_inventory to load vars for managed_node1 30529 1726882628.04879: Calling groups_inventory to load vars for managed_node1 30529 1726882628.04882: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.04887: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.04894: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.05103: Calling groups_plugins_play to load vars for managed_node1 30529 1726882628.07248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.10159: done with get_vars() 30529 1726882628.10184: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.213) 0:00:42.128 ****** 30529 1726882628.10289: entering _queue_task() for managed_node1/include_tasks 30529 1726882628.10855: worker is 1 (out of 1 available) 30529 1726882628.10868: exiting _queue_task() for managed_node1/include_tasks 30529 1726882628.10881: done queuing things up, now waiting for results queue to drain 30529 1726882628.10883: waiting for pending results... 
30529 1726882628.11564: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882628.11878: in run() - task 12673a56-9f93-b0f1-edc0-000000000e86 30529 1726882628.11882: variable 'ansible_search_path' from source: unknown 30529 1726882628.11885: variable 'ansible_search_path' from source: unknown 30529 1726882628.11888: calling self._execute() 30529 1726882628.12068: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.12072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.12084: variable 'omit' from source: magic vars 30529 1726882628.12963: variable 'ansible_distribution_major_version' from source: facts 30529 1726882628.12972: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882628.13198: _execute() done 30529 1726882628.13202: dumping result to json 30529 1726882628.13203: done dumping result, returning 30529 1726882628.13205: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-000000000e86] 30529 1726882628.13207: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e86 30529 1726882628.13276: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e86 30529 1726882628.13279: WORKER PROCESS EXITING 30529 1726882628.13311: no more pending results, returning what we have 30529 1726882628.13316: in VariableManager get_vars() 30529 1726882628.13354: Calling all_inventory to load vars for managed_node1 30529 1726882628.13357: Calling groups_inventory to load vars for managed_node1 30529 1726882628.13360: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.13374: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.13378: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.13380: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882628.15890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.19615: done with get_vars() 30529 1726882628.19640: variable 'ansible_search_path' from source: unknown 30529 1726882628.19641: variable 'ansible_search_path' from source: unknown 30529 1726882628.19654: variable 'item' from source: include params 30529 1726882628.20078: variable 'item' from source: include params 30529 1726882628.20115: we have included files to process 30529 1726882628.20306: generating all_blocks data 30529 1726882628.20309: done generating all_blocks data 30529 1726882628.20311: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882628.20312: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882628.20315: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882628.20839: done processing included file 30529 1726882628.20842: iterating over new_blocks loaded from include file 30529 1726882628.20843: in VariableManager get_vars() 30529 1726882628.20862: done with get_vars() 30529 1726882628.20864: filtering new block on tags 30529 1726882628.21168: done filtering new block on tags 30529 1726882628.21171: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882628.21176: extending task lists for all hosts with included blocks 30529 1726882628.21702: done extending task lists 30529 1726882628.21704: done processing included files 30529 1726882628.21704: results queue empty 30529 1726882628.21705: checking for any_errors_fatal 30529 1726882628.21709: done 
checking for any_errors_fatal 30529 1726882628.21710: checking for max_fail_percentage 30529 1726882628.21711: done checking for max_fail_percentage 30529 1726882628.21712: checking to see if all hosts have failed and the running result is not ok 30529 1726882628.21713: done checking to see if all hosts have failed 30529 1726882628.21713: getting the remaining hosts for this loop 30529 1726882628.21715: done getting the remaining hosts for this loop 30529 1726882628.21718: getting the next task for host managed_node1 30529 1726882628.21722: done getting next task for host managed_node1 30529 1726882628.21724: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882628.21728: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882628.21730: getting variables 30529 1726882628.21731: in VariableManager get_vars() 30529 1726882628.21744: Calling all_inventory to load vars for managed_node1 30529 1726882628.21747: Calling groups_inventory to load vars for managed_node1 30529 1726882628.21750: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.21758: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.21879: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.21884: Calling groups_plugins_play to load vars for managed_node1 30529 1726882628.24655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.28136: done with get_vars() 30529 1726882628.28165: done getting variables 30529 1726882628.28512: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.182) 0:00:42.311 ****** 30529 1726882628.28544: entering _queue_task() for managed_node1/stat 30529 1726882628.29246: worker is 1 (out of 1 available) 30529 1726882628.29258: exiting _queue_task() for managed_node1/stat 30529 1726882628.29271: done queuing things up, now waiting for results queue to drain 30529 1726882628.29273: waiting for pending results... 
30529 1726882628.30312: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882628.30317: in run() - task 12673a56-9f93-b0f1-edc0-000000000ef5 30529 1726882628.30321: variable 'ansible_search_path' from source: unknown 30529 1726882628.30324: variable 'ansible_search_path' from source: unknown 30529 1726882628.30376: calling self._execute() 30529 1726882628.30466: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.30599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.30609: variable 'omit' from source: magic vars 30529 1726882628.31328: variable 'ansible_distribution_major_version' from source: facts 30529 1726882628.31412: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882628.31418: variable 'omit' from source: magic vars 30529 1726882628.31590: variable 'omit' from source: magic vars 30529 1726882628.31997: variable 'interface' from source: play vars 30529 1726882628.32001: variable 'omit' from source: magic vars 30529 1726882628.32004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882628.32025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882628.32047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882628.32066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882628.32080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882628.32222: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882628.32226: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.32229: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.32453: Set connection var ansible_shell_executable to /bin/sh 30529 1726882628.32457: Set connection var ansible_pipelining to False 30529 1726882628.32460: Set connection var ansible_shell_type to sh 30529 1726882628.32471: Set connection var ansible_timeout to 10 30529 1726882628.32473: Set connection var ansible_connection to ssh 30529 1726882628.32478: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882628.32504: variable 'ansible_shell_executable' from source: unknown 30529 1726882628.32507: variable 'ansible_connection' from source: unknown 30529 1726882628.32510: variable 'ansible_module_compression' from source: unknown 30529 1726882628.32513: variable 'ansible_shell_type' from source: unknown 30529 1726882628.32516: variable 'ansible_shell_executable' from source: unknown 30529 1726882628.32518: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.32520: variable 'ansible_pipelining' from source: unknown 30529 1726882628.32523: variable 'ansible_timeout' from source: unknown 30529 1726882628.32526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.33203: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882628.33208: variable 'omit' from source: magic vars 30529 1726882628.33211: starting attempt loop 30529 1726882628.33214: running the handler 30529 1726882628.33217: _low_level_execute_command(): starting 30529 1726882628.33220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882628.34638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882628.34650: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30529 1726882628.34662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882628.34840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882628.34958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882628.35220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.35295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.36973: stdout chunk (state=3): >>>/root <<< 30529 1726882628.37060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882628.37202: stderr chunk (state=3): >>><<< 30529 1726882628.37208: stdout chunk (state=3): >>><<< 30529 1726882628.37242: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882628.37255: _low_level_execute_command(): starting 30529 1726882628.37262: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471 `" && echo ansible-tmp-1726882628.3724067-32558-157622243641471="` echo /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471 `" ) && sleep 0' 30529 1726882628.38610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882628.38801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882628.38917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.38987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.41199: stdout chunk (state=3): >>>ansible-tmp-1726882628.3724067-32558-157622243641471=/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471 <<< 30529 1726882628.41202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882628.41206: stdout chunk (state=3): >>><<< 30529 1726882628.41208: stderr chunk (state=3): >>><<< 30529 1726882628.41211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882628.3724067-32558-157622243641471=/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882628.41214: variable 'ansible_module_compression' from source: unknown 30529 1726882628.41217: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882628.41398: variable 'ansible_facts' from source: unknown 30529 1726882628.41564: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py 30529 1726882628.42194: Sending initial data 30529 1726882628.42198: Sent initial data (153 bytes) 30529 1726882628.43184: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882628.43199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882628.43211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882628.43228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882628.43406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882628.43411: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882628.43485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.43551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.45089: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882628.45095: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882628.45120: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882628.45124: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882628.45225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882628.45244: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8ajzoqo4 /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py <<< 30529 1726882628.45257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py" <<< 30529 1726882628.45292: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8ajzoqo4" to remote "/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py" <<< 30529 1726882628.47150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882628.47154: stdout chunk (state=3): >>><<< 30529 1726882628.47160: stderr chunk (state=3): >>><<< 30529 1726882628.47219: done transferring module to remote 30529 1726882628.47231: _low_level_execute_command(): starting 30529 1726882628.47236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/ /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py && sleep 0' 30529 1726882628.48635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882628.48644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882628.48655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882628.48899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882628.48957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882628.48979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.49030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.50798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882628.50802: stdout chunk (state=3): >>><<< 30529 1726882628.50804: stderr chunk (state=3): >>><<< 30529 1726882628.50807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882628.50809: _low_level_execute_command(): starting 30529 1726882628.50811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/AnsiballZ_stat.py && sleep 0' 30529 1726882628.52101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882628.52105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882628.52118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882628.52209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882628.52260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882628.52308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 30529 1726882628.52319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.52454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.67592: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30701, "dev": 23, "nlink": 1, "atime": 1726882619.8873973, "mtime": 1726882619.8873973, "ctime": 1726882619.8873973, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882628.68658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882628.68663: stdout chunk (state=3): >>><<< 30529 1726882628.68672: stderr chunk (state=3): >>><<< 30529 1726882628.68705: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30701, "dev": 23, "nlink": 1, "atime": 1726882619.8873973, "mtime": 1726882619.8873973, "ctime": 1726882619.8873973, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882628.68759: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882628.68769: _low_level_execute_command(): starting 30529 1726882628.68774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882628.3724067-32558-157622243641471/ > /dev/null 2>&1 && sleep 0' 30529 1726882628.70199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882628.70217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882628.70236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882628.70256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882628.70275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 <<< 30529 1726882628.70288: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882628.70389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882628.70630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882628.70668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882628.72489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882628.72504: stdout chunk (state=3): >>><<< 30529 1726882628.72517: stderr chunk (state=3): >>><<< 30529 1726882628.72538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882628.72551: handler run complete 30529 1726882628.72602: attempt loop complete, returning result 30529 1726882628.72666: _execute() done 30529 1726882628.72675: dumping result to json 30529 1726882628.72685: done dumping result, returning 30529 1726882628.72702: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000000ef5] 30529 1726882628.72710: sending task result for task 12673a56-9f93-b0f1-edc0-000000000ef5 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882619.8873973, "block_size": 4096, "blocks": 0, "ctime": 1726882619.8873973, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30701, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882619.8873973, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30529 1726882628.72930: no more pending results, returning what we have 30529 1726882628.72934: results queue empty 30529 1726882628.72935: checking for any_errors_fatal 30529 
1726882628.72936: done checking for any_errors_fatal 30529 1726882628.72937: checking for max_fail_percentage 30529 1726882628.72939: done checking for max_fail_percentage 30529 1726882628.72940: checking to see if all hosts have failed and the running result is not ok 30529 1726882628.72941: done checking to see if all hosts have failed 30529 1726882628.72941: getting the remaining hosts for this loop 30529 1726882628.72943: done getting the remaining hosts for this loop 30529 1726882628.72947: getting the next task for host managed_node1 30529 1726882628.72957: done getting next task for host managed_node1 30529 1726882628.72959: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30529 1726882628.72963: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882628.72967: getting variables 30529 1726882628.72969: in VariableManager get_vars() 30529 1726882628.73312: Calling all_inventory to load vars for managed_node1 30529 1726882628.73315: Calling groups_inventory to load vars for managed_node1 30529 1726882628.73319: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.73332: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.73336: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.73340: Calling groups_plugins_play to load vars for managed_node1 30529 1726882628.73901: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000ef5 30529 1726882628.73904: WORKER PROCESS EXITING 30529 1726882628.77767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.80767: done with get_vars() 30529 1726882628.80803: done getting variables 30529 1726882628.80858: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882628.81185: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:37:08 -0400 (0:00:00.528) 0:00:42.840 ****** 30529 1726882628.81423: entering _queue_task() for managed_node1/assert 30529 1726882628.82828: worker is 1 (out of 1 available) 30529 1726882628.82840: exiting _queue_task() for managed_node1/assert 30529 1726882628.82852: done queuing things up, now waiting for results queue to drain 30529 1726882628.82853: waiting for pending results... 
30529 1726882628.83614: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' 30529 1726882628.84099: in run() - task 12673a56-9f93-b0f1-edc0-000000000e87 30529 1726882628.84103: variable 'ansible_search_path' from source: unknown 30529 1726882628.84106: variable 'ansible_search_path' from source: unknown 30529 1726882628.84108: calling self._execute() 30529 1726882628.84111: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.84113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.84459: variable 'omit' from source: magic vars 30529 1726882628.85395: variable 'ansible_distribution_major_version' from source: facts 30529 1726882628.85414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882628.85503: variable 'omit' from source: magic vars 30529 1726882628.85556: variable 'omit' from source: magic vars 30529 1726882628.85769: variable 'interface' from source: play vars 30529 1726882628.85840: variable 'omit' from source: magic vars 30529 1726882628.85888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882628.85968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882628.86061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882628.86114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882628.86162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882628.86223: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882628.86324: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.86333: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.86460: Set connection var ansible_shell_executable to /bin/sh 30529 1726882628.86524: Set connection var ansible_pipelining to False 30529 1726882628.86563: Set connection var ansible_shell_type to sh 30529 1726882628.86585: Set connection var ansible_timeout to 10 30529 1726882628.86596: Set connection var ansible_connection to ssh 30529 1726882628.86608: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882628.86635: variable 'ansible_shell_executable' from source: unknown 30529 1726882628.86644: variable 'ansible_connection' from source: unknown 30529 1726882628.86651: variable 'ansible_module_compression' from source: unknown 30529 1726882628.86657: variable 'ansible_shell_type' from source: unknown 30529 1726882628.86664: variable 'ansible_shell_executable' from source: unknown 30529 1726882628.86670: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.86677: variable 'ansible_pipelining' from source: unknown 30529 1726882628.86691: variable 'ansible_timeout' from source: unknown 30529 1726882628.86702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.86860: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882628.86879: variable 'omit' from source: magic vars 30529 1726882628.86889: starting attempt loop 30529 1726882628.86995: running the handler 30529 1726882628.87049: variable 'interface_stat' from source: set_fact 30529 1726882628.87074: Evaluated conditional (interface_stat.stat.exists): True 30529 1726882628.87084: handler run complete 30529 1726882628.87109: attempt loop complete, returning result 30529 
1726882628.87126: _execute() done 30529 1726882628.87199: dumping result to json 30529 1726882628.87202: done dumping result, returning 30529 1726882628.87205: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' [12673a56-9f93-b0f1-edc0-000000000e87] 30529 1726882628.87207: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e87 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882628.87390: no more pending results, returning what we have 30529 1726882628.87395: results queue empty 30529 1726882628.87397: checking for any_errors_fatal 30529 1726882628.87411: done checking for any_errors_fatal 30529 1726882628.87413: checking for max_fail_percentage 30529 1726882628.87415: done checking for max_fail_percentage 30529 1726882628.87416: checking to see if all hosts have failed and the running result is not ok 30529 1726882628.87417: done checking to see if all hosts have failed 30529 1726882628.87417: getting the remaining hosts for this loop 30529 1726882628.87419: done getting the remaining hosts for this loop 30529 1726882628.87424: getting the next task for host managed_node1 30529 1726882628.87436: done getting next task for host managed_node1 30529 1726882628.87440: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30529 1726882628.87451: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882628.87457: getting variables 30529 1726882628.87460: in VariableManager get_vars() 30529 1726882628.87497: Calling all_inventory to load vars for managed_node1 30529 1726882628.87500: Calling groups_inventory to load vars for managed_node1 30529 1726882628.87505: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.87562: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e87 30529 1726882628.87565: WORKER PROCESS EXITING 30529 1726882628.87578: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.87583: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.87587: Calling groups_plugins_play to load vars for managed_node1 30529 1726882628.90324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882628.91876: done with get_vars() 30529 1726882628.91900: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:37:08 -0400 (0:00:00.108) 0:00:42.949 ****** 30529 1726882628.92315: entering _queue_task() for managed_node1/include_tasks 30529 1726882628.93052: worker is 1 (out of 1 available) 30529 1726882628.93065: exiting _queue_task() for managed_node1/include_tasks 30529 1726882628.93078: done queuing things up, now waiting for results queue to drain 30529 1726882628.93080: waiting for pending results... 
30529 1726882628.93591: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 30529 1726882628.93894: in run() - task 12673a56-9f93-b0f1-edc0-000000000e8b 30529 1726882628.93919: variable 'ansible_search_path' from source: unknown 30529 1726882628.94204: variable 'ansible_search_path' from source: unknown 30529 1726882628.94209: calling self._execute() 30529 1726882628.94241: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882628.94253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882628.94269: variable 'omit' from source: magic vars 30529 1726882628.95059: variable 'ansible_distribution_major_version' from source: facts 30529 1726882628.95086: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882628.95117: _execute() done 30529 1726882628.95218: dumping result to json 30529 1726882628.95222: done dumping result, returning 30529 1726882628.95226: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-000000000e8b] 30529 1726882628.95229: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8b 30529 1726882628.95468: no more pending results, returning what we have 30529 1726882628.95474: in VariableManager get_vars() 30529 1726882628.95517: Calling all_inventory to load vars for managed_node1 30529 1726882628.95520: Calling groups_inventory to load vars for managed_node1 30529 1726882628.95524: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882628.95539: Calling all_plugins_play to load vars for managed_node1 30529 1726882628.95543: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882628.95547: Calling groups_plugins_play to load vars for managed_node1 30529 1726882628.96411: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8b 30529 1726882628.96415: WORKER PROCESS EXITING 30529 
1726882628.98690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882629.02615: done with get_vars() 30529 1726882629.02706: variable 'ansible_search_path' from source: unknown 30529 1726882629.02708: variable 'ansible_search_path' from source: unknown 30529 1726882629.02718: variable 'item' from source: include params 30529 1726882629.02889: variable 'item' from source: include params 30529 1726882629.02992: we have included files to process 30529 1726882629.02995: generating all_blocks data 30529 1726882629.02997: done generating all_blocks data 30529 1726882629.03001: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882629.03002: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882629.03004: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882629.05078: done processing included file 30529 1726882629.05080: iterating over new_blocks loaded from include file 30529 1726882629.05081: in VariableManager get_vars() 30529 1726882629.05103: done with get_vars() 30529 1726882629.05105: filtering new block on tags 30529 1726882629.05310: done filtering new block on tags 30529 1726882629.05313: in VariableManager get_vars() 30529 1726882629.05328: done with get_vars() 30529 1726882629.05330: filtering new block on tags 30529 1726882629.05503: done filtering new block on tags 30529 1726882629.05506: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 30529 1726882629.05511: extending task lists for all hosts with included blocks 30529 1726882629.06200: done 
extending task lists 30529 1726882629.06201: done processing included files 30529 1726882629.06202: results queue empty 30529 1726882629.06203: checking for any_errors_fatal 30529 1726882629.06206: done checking for any_errors_fatal 30529 1726882629.06207: checking for max_fail_percentage 30529 1726882629.06208: done checking for max_fail_percentage 30529 1726882629.06209: checking to see if all hosts have failed and the running result is not ok 30529 1726882629.06210: done checking to see if all hosts have failed 30529 1726882629.06211: getting the remaining hosts for this loop 30529 1726882629.06212: done getting the remaining hosts for this loop 30529 1726882629.06215: getting the next task for host managed_node1 30529 1726882629.06300: done getting next task for host managed_node1 30529 1726882629.06303: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882629.06306: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882629.06308: getting variables 30529 1726882629.06309: in VariableManager get_vars() 30529 1726882629.06317: Calling all_inventory to load vars for managed_node1 30529 1726882629.06319: Calling groups_inventory to load vars for managed_node1 30529 1726882629.06322: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882629.06369: Calling all_plugins_play to load vars for managed_node1 30529 1726882629.06373: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882629.06377: Calling groups_plugins_play to load vars for managed_node1 30529 1726882629.08880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882629.12215: done with get_vars() 30529 1726882629.12244: done getting variables 30529 1726882629.12353: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:37:09 -0400 (0:00:00.201) 0:00:43.150 ****** 30529 1726882629.12436: entering _queue_task() for managed_node1/set_fact 30529 1726882629.13297: worker is 1 (out of 1 available) 30529 1726882629.13310: exiting _queue_task() for managed_node1/set_fact 30529 1726882629.13322: done queuing things up, now waiting for results queue to drain 30529 1726882629.13324: waiting for pending results... 
30529 1726882629.13866: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882629.14205: in run() - task 12673a56-9f93-b0f1-edc0-000000000f13 30529 1726882629.14212: variable 'ansible_search_path' from source: unknown 30529 1726882629.14215: variable 'ansible_search_path' from source: unknown 30529 1726882629.14253: calling self._execute() 30529 1726882629.14639: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.14643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.14646: variable 'omit' from source: magic vars 30529 1726882629.15334: variable 'ansible_distribution_major_version' from source: facts 30529 1726882629.15352: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882629.15365: variable 'omit' from source: magic vars 30529 1726882629.15511: variable 'omit' from source: magic vars 30529 1726882629.15554: variable 'omit' from source: magic vars 30529 1726882629.15672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882629.15715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882629.15956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882629.15960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.15962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.15964: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882629.15966: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.15968: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882629.16283: Set connection var ansible_shell_executable to /bin/sh 30529 1726882629.16392: Set connection var ansible_pipelining to False 30529 1726882629.16397: Set connection var ansible_shell_type to sh 30529 1726882629.16399: Set connection var ansible_timeout to 10 30529 1726882629.16401: Set connection var ansible_connection to ssh 30529 1726882629.16403: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882629.16405: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.16407: variable 'ansible_connection' from source: unknown 30529 1726882629.16409: variable 'ansible_module_compression' from source: unknown 30529 1726882629.16411: variable 'ansible_shell_type' from source: unknown 30529 1726882629.16413: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.16415: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.16417: variable 'ansible_pipelining' from source: unknown 30529 1726882629.16419: variable 'ansible_timeout' from source: unknown 30529 1726882629.16421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.16643: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882629.16660: variable 'omit' from source: magic vars 30529 1726882629.16670: starting attempt loop 30529 1726882629.16677: running the handler 30529 1726882629.16696: handler run complete 30529 1726882629.16728: attempt loop complete, returning result 30529 1726882629.16803: _execute() done 30529 1726882629.16811: dumping result to json 30529 1726882629.16824: done dumping result, returning 30529 1726882629.16837: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-000000000f13] 30529 1726882629.16847: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f13 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30529 1726882629.16998: no more pending results, returning what we have 30529 1726882629.17003: results queue empty 30529 1726882629.17004: checking for any_errors_fatal 30529 1726882629.17006: done checking for any_errors_fatal 30529 1726882629.17007: checking for max_fail_percentage 30529 1726882629.17008: done checking for max_fail_percentage 30529 1726882629.17009: checking to see if all hosts have failed and the running result is not ok 30529 1726882629.17010: done checking to see if all hosts have failed 30529 1726882629.17011: getting the remaining hosts for this loop 30529 1726882629.17013: done getting the remaining hosts for this loop 30529 1726882629.17017: getting the next task for host managed_node1 30529 1726882629.17028: done getting next task for host managed_node1 30529 1726882629.17030: ^ task is: TASK: Stat profile file 30529 1726882629.17037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882629.17041: getting variables 30529 1726882629.17043: in VariableManager get_vars() 30529 1726882629.17078: Calling all_inventory to load vars for managed_node1 30529 1726882629.17081: Calling groups_inventory to load vars for managed_node1 30529 1726882629.17085: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882629.17099: Calling all_plugins_play to load vars for managed_node1 30529 1726882629.17103: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882629.17106: Calling groups_plugins_play to load vars for managed_node1 30529 1726882629.18100: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f13 30529 1726882629.18104: WORKER PROCESS EXITING 30529 1726882629.20265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882629.23604: done with get_vars() 30529 1726882629.23636: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:37:09 -0400 (0:00:00.112) 0:00:43.263 ****** 30529 1726882629.23734: entering _queue_task() for managed_node1/stat 30529 1726882629.24497: worker is 1 (out of 1 available) 30529 1726882629.24511: exiting _queue_task() for managed_node1/stat 30529 1726882629.24523: done queuing things up, now waiting for results queue to drain 30529 1726882629.24525: waiting for pending results... 
30529 1726882629.25005: running TaskExecutor() for managed_node1/TASK: Stat profile file 30529 1726882629.25272: in run() - task 12673a56-9f93-b0f1-edc0-000000000f14 30529 1726882629.25292: variable 'ansible_search_path' from source: unknown 30529 1726882629.25345: variable 'ansible_search_path' from source: unknown 30529 1726882629.25385: calling self._execute() 30529 1726882629.25558: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.25775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.25778: variable 'omit' from source: magic vars 30529 1726882629.26374: variable 'ansible_distribution_major_version' from source: facts 30529 1726882629.26395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882629.26434: variable 'omit' from source: magic vars 30529 1726882629.26494: variable 'omit' from source: magic vars 30529 1726882629.26733: variable 'profile' from source: play vars 30529 1726882629.26746: variable 'interface' from source: play vars 30529 1726882629.26970: variable 'interface' from source: play vars 30529 1726882629.26973: variable 'omit' from source: magic vars 30529 1726882629.27300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882629.27304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882629.27306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882629.27309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.27311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.27408: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 
1726882629.27412: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.27414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.27464: Set connection var ansible_shell_executable to /bin/sh 30529 1726882629.27524: Set connection var ansible_pipelining to False 30529 1726882629.27532: Set connection var ansible_shell_type to sh 30529 1726882629.27545: Set connection var ansible_timeout to 10 30529 1726882629.27552: Set connection var ansible_connection to ssh 30529 1726882629.27633: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882629.27659: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.27668: variable 'ansible_connection' from source: unknown 30529 1726882629.27675: variable 'ansible_module_compression' from source: unknown 30529 1726882629.27681: variable 'ansible_shell_type' from source: unknown 30529 1726882629.27687: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.27695: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.27705: variable 'ansible_pipelining' from source: unknown 30529 1726882629.27713: variable 'ansible_timeout' from source: unknown 30529 1726882629.27722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.28169: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882629.28175: variable 'omit' from source: magic vars 30529 1726882629.28186: starting attempt loop 30529 1726882629.28192: running the handler 30529 1726882629.28213: _low_level_execute_command(): starting 30529 1726882629.28288: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882629.29637: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882629.29649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882629.29664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882629.29683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882629.29892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.29921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.30039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.31707: stdout chunk (state=3): >>>/root <<< 30529 1726882629.31877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.31880: stdout chunk (state=3): >>><<< 30529 1726882629.31882: stderr chunk (state=3): >>><<< 30529 1726882629.32071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882629.32078: _low_level_execute_command(): starting 30529 1726882629.32081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286 `" && echo ansible-tmp-1726882629.3198178-32595-30740752696286="` echo /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286 `" ) && sleep 0' 30529 1726882629.33099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882629.33116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882629.33160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882629.33178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882629.33268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882629.33392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.33418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.33541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.35404: stdout chunk (state=3): >>>ansible-tmp-1726882629.3198178-32595-30740752696286=/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286 <<< 30529 1726882629.35504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.35662: stderr chunk (state=3): >>><<< 30529 1726882629.35665: stdout chunk (state=3): >>><<< 30529 1726882629.35682: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882629.3198178-32595-30740752696286=/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882629.35734: variable 'ansible_module_compression' from source: unknown 30529 1726882629.35970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882629.35974: variable 'ansible_facts' from source: unknown 30529 1726882629.36053: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py 30529 1726882629.36531: Sending initial data 30529 1726882629.36534: Sent initial data (152 bytes) 30529 1726882629.37712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882629.37750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.37786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.39299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882629.39327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882629.39389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpyozb_s5i /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py <<< 30529 1726882629.39401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py" <<< 30529 1726882629.39470: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpyozb_s5i" to remote "/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py" <<< 30529 1726882629.40804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.40814: stdout chunk (state=3): >>><<< 30529 1726882629.40828: stderr chunk (state=3): >>><<< 30529 1726882629.40868: done transferring module to remote 30529 1726882629.41080: _low_level_execute_command(): starting 30529 1726882629.41084: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/ /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py && sleep 0' 30529 1726882629.42001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882629.42054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882629.42069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882629.42206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882629.42382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.42409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.42479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.44211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.44222: stdout chunk (state=3): >>><<< 30529 1726882629.44232: stderr chunk (state=3): >>><<< 30529 1726882629.44399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882629.44403: _low_level_execute_command(): starting 30529 1726882629.44406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/AnsiballZ_stat.py && sleep 0' 30529 1726882629.45511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882629.45623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882629.45640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.45663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.45784: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.60929: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882629.62112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882629.62126: stdout chunk (state=3): >>><<< 30529 1726882629.62146: stderr chunk (state=3): >>><<< 30529 1726882629.62169: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882629.62373: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882629.62377: _low_level_execute_command(): starting 30529 1726882629.62379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882629.3198178-32595-30740752696286/ > /dev/null 2>&1 && sleep 0' 30529 1726882629.63788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882629.63992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.64267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.64302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.66249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.66257: stdout chunk (state=3): >>><<< 30529 1726882629.66260: stderr chunk (state=3): >>><<< 30529 1726882629.66303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30529 1726882629.66307: handler run complete 30529 1726882629.66320: attempt loop complete, returning result 30529 1726882629.66323: _execute() done 30529 1726882629.66325: dumping result to json 30529 1726882629.66328: done dumping result, returning 30529 1726882629.66337: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-000000000f14] 30529 1726882629.66342: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f14 30529 1726882629.66576: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f14 30529 1726882629.66579: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882629.66675: no more pending results, returning what we have 30529 1726882629.66679: results queue empty 30529 1726882629.66680: checking for any_errors_fatal 30529 1726882629.66689: done checking for any_errors_fatal 30529 1726882629.66689: checking for max_fail_percentage 30529 1726882629.66691: done checking for max_fail_percentage 30529 1726882629.66692: checking to see if all hosts have failed and the running result is not ok 30529 1726882629.66694: done checking to see if all hosts have failed 30529 1726882629.66695: getting the remaining hosts for this loop 30529 1726882629.66697: done getting the remaining hosts for this loop 30529 1726882629.66702: getting the next task for host managed_node1 30529 1726882629.66710: done getting next task for host managed_node1 30529 1726882629.66713: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882629.66719: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882629.66723: getting variables 30529 1726882629.66725: in VariableManager get_vars() 30529 1726882629.66762: Calling all_inventory to load vars for managed_node1 30529 1726882629.66765: Calling groups_inventory to load vars for managed_node1 30529 1726882629.66769: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882629.66781: Calling all_plugins_play to load vars for managed_node1 30529 1726882629.66785: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882629.66788: Calling groups_plugins_play to load vars for managed_node1 30529 1726882629.71884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882629.76923: done with get_vars() 30529 1726882629.77079: done getting variables 30529 1726882629.77151: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:37:09 -0400 (0:00:00.535) 0:00:43.799 ****** 30529 1726882629.77309: entering _queue_task() for managed_node1/set_fact 30529 1726882629.78260: worker is 1 (out of 1 available) 30529 1726882629.78401: exiting _queue_task() for managed_node1/set_fact 30529 1726882629.78420: done queuing things up, now waiting for results queue to drain 30529 1726882629.78422: waiting for pending results... 30529 1726882629.79459: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882629.79786: in run() - task 12673a56-9f93-b0f1-edc0-000000000f15 30529 1726882629.79894: variable 'ansible_search_path' from source: unknown 30529 1726882629.79899: variable 'ansible_search_path' from source: unknown 30529 1726882629.79946: calling self._execute() 30529 1726882629.80142: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.80145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.80165: variable 'omit' from source: magic vars 30529 1726882629.80939: variable 'ansible_distribution_major_version' from source: facts 30529 1726882629.80951: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882629.81308: variable 'profile_stat' from source: set_fact 30529 1726882629.81341: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882629.81344: when evaluation is False, skipping this task 30529 1726882629.81346: _execute() done 30529 1726882629.81349: dumping result to json 30529 1726882629.81356: done dumping 
result, returning 30529 1726882629.81359: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-000000000f15] 30529 1726882629.81361: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f15 30529 1726882629.81437: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f15 30529 1726882629.81440: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882629.81500: no more pending results, returning what we have 30529 1726882629.81505: results queue empty 30529 1726882629.81506: checking for any_errors_fatal 30529 1726882629.81517: done checking for any_errors_fatal 30529 1726882629.81517: checking for max_fail_percentage 30529 1726882629.81519: done checking for max_fail_percentage 30529 1726882629.81520: checking to see if all hosts have failed and the running result is not ok 30529 1726882629.81521: done checking to see if all hosts have failed 30529 1726882629.81521: getting the remaining hosts for this loop 30529 1726882629.81523: done getting the remaining hosts for this loop 30529 1726882629.81527: getting the next task for host managed_node1 30529 1726882629.81535: done getting next task for host managed_node1 30529 1726882629.81538: ^ task is: TASK: Get NM profile info 30529 1726882629.81543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882629.81546: getting variables 30529 1726882629.81549: in VariableManager get_vars() 30529 1726882629.81584: Calling all_inventory to load vars for managed_node1 30529 1726882629.81586: Calling groups_inventory to load vars for managed_node1 30529 1726882629.81592: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882629.81608: Calling all_plugins_play to load vars for managed_node1 30529 1726882629.81612: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882629.81615: Calling groups_plugins_play to load vars for managed_node1 30529 1726882629.85307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882629.89012: done with get_vars() 30529 1726882629.89156: done getting variables 30529 1726882629.89248: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:37:09 -0400 (0:00:00.120) 
0:00:43.919 ****** 30529 1726882629.89324: entering _queue_task() for managed_node1/shell 30529 1726882629.90226: worker is 1 (out of 1 available) 30529 1726882629.90297: exiting _queue_task() for managed_node1/shell 30529 1726882629.90315: done queuing things up, now waiting for results queue to drain 30529 1726882629.90318: waiting for pending results... 30529 1726882629.91488: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882629.91498: in run() - task 12673a56-9f93-b0f1-edc0-000000000f16 30529 1726882629.91504: variable 'ansible_search_path' from source: unknown 30529 1726882629.91507: variable 'ansible_search_path' from source: unknown 30529 1726882629.91618: calling self._execute() 30529 1726882629.91919: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.91923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.91927: variable 'omit' from source: magic vars 30529 1726882629.93539: variable 'ansible_distribution_major_version' from source: facts 30529 1726882629.93571: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882629.93677: variable 'omit' from source: magic vars 30529 1726882629.93868: variable 'omit' from source: magic vars 30529 1726882629.94101: variable 'profile' from source: play vars 30529 1726882629.94113: variable 'interface' from source: play vars 30529 1726882629.94264: variable 'interface' from source: play vars 30529 1726882629.94291: variable 'omit' from source: magic vars 30529 1726882629.94360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882629.94581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882629.94610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882629.94704: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.94772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882629.94876: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882629.94879: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.94881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.95105: Set connection var ansible_shell_executable to /bin/sh 30529 1726882629.95117: Set connection var ansible_pipelining to False 30529 1726882629.95124: Set connection var ansible_shell_type to sh 30529 1726882629.95137: Set connection var ansible_timeout to 10 30529 1726882629.95144: Set connection var ansible_connection to ssh 30529 1726882629.95155: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882629.95279: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.95288: variable 'ansible_connection' from source: unknown 30529 1726882629.95329: variable 'ansible_module_compression' from source: unknown 30529 1726882629.95405: variable 'ansible_shell_type' from source: unknown 30529 1726882629.95410: variable 'ansible_shell_executable' from source: unknown 30529 1726882629.95413: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882629.95416: variable 'ansible_pipelining' from source: unknown 30529 1726882629.95419: variable 'ansible_timeout' from source: unknown 30529 1726882629.95422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882629.95594: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882629.95615: variable 'omit' from source: magic vars 30529 1726882629.95661: starting attempt loop 30529 1726882629.95664: running the handler 30529 1726882629.95668: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882629.95685: _low_level_execute_command(): starting 30529 1726882629.95725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882629.96837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882629.96851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882629.96867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882629.96885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882629.96907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882629.96939: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882629.97031: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882629.97102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882629.97324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882629.98930: stdout chunk (state=3): >>>/root <<< 30529 1726882629.99060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882629.99337: stderr chunk (state=3): >>><<< 30529 1726882629.99340: stdout chunk (state=3): >>><<< 30529 1726882629.99344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882629.99346: 
_low_level_execute_command(): starting 30529 1726882629.99350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134 `" && echo ansible-tmp-1726882629.9927368-32635-280661059408134="` echo /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134 `" ) && sleep 0' 30529 1726882630.00664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882630.00679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.00735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882630.00803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882630.00875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882630.02714: stdout chunk (state=3): >>>ansible-tmp-1726882629.9927368-32635-280661059408134=/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134 <<< 
30529 1726882630.02903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882630.02906: stdout chunk (state=3): >>><<< 30529 1726882630.02909: stderr chunk (state=3): >>><<< 30529 1726882630.02912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882629.9927368-32635-280661059408134=/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882630.02929: variable 'ansible_module_compression' from source: unknown 30529 1726882630.02980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882630.03025: variable 'ansible_facts' from source: unknown 30529 1726882630.03110: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py 30529 1726882630.03337: Sending initial data 30529 1726882630.03341: Sent initial data (156 bytes) 30529 1726882630.04018: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882630.04033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882630.04052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882630.04127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882630.05635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882630.05672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882630.05718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0n5pu18b /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py <<< 30529 1726882630.05721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py" <<< 30529 1726882630.05760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0n5pu18b" to remote "/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py" <<< 30529 1726882630.05764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py" <<< 30529 1726882630.06287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882630.06337: stderr chunk (state=3): >>><<< 30529 1726882630.06340: stdout chunk (state=3): >>><<< 30529 1726882630.06389: done transferring module to remote 30529 1726882630.06399: _low_level_execute_command(): starting 30529 1726882630.06406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/ /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py && sleep 0' 30529 1726882630.07011: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882630.07042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.07150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882630.07162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882630.07231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882630.08934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882630.08962: stderr chunk (state=3): >>><<< 30529 1726882630.08965: stdout chunk (state=3): >>><<< 30529 1726882630.08980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882630.08983: _low_level_execute_command(): starting 30529 1726882630.08986: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/AnsiballZ_command.py && sleep 0' 30529 1726882630.09480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882630.09484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.09489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882630.09496: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.09532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882630.09583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882630.26118: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:37:10.243173", "end": "2024-09-20 21:37:10.259843", "delta": "0:00:00.016670", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882630.27580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882630.27585: stdout chunk (state=3): >>><<< 30529 1726882630.27587: stderr chunk (state=3): >>><<< 30529 1726882630.27608: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:37:10.243173", "end": "2024-09-20 21:37:10.259843", "delta": "0:00:00.016670", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 30529 1726882630.27646: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882630.27652: _low_level_execute_command(): starting 30529 1726882630.27657: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882629.9927368-32635-280661059408134/ > /dev/null 2>&1 && sleep 0' 30529 1726882630.28062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882630.28096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.28105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882630.28108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882630.28150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882630.28153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882630.28205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882630.30005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882630.30021: stderr chunk (state=3): >>><<< 30529 1726882630.30026: stdout chunk (state=3): >>><<< 30529 1726882630.30042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882630.30052: handler run complete 30529 1726882630.30071: Evaluated conditional (False): False 30529 1726882630.30081: attempt loop complete, returning result 30529 1726882630.30083: _execute() done 30529 1726882630.30086: dumping result to json 30529 1726882630.30095: done dumping result, returning 30529 1726882630.30104: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-000000000f16] 30529 1726882630.30106: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f16 ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016670", "end": "2024-09-20 21:37:10.259843", "rc": 0, "start": "2024-09-20 21:37:10.243173" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30529 1726882630.30276: no more pending results, returning what we have 30529 1726882630.30280: results queue empty 30529 1726882630.30281: checking for any_errors_fatal 30529 1726882630.30288: done checking for any_errors_fatal 30529 1726882630.30289: checking for max_fail_percentage 30529 1726882630.30290: done checking for max_fail_percentage 30529 1726882630.30291: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.30294: done checking to see if all hosts have failed 30529 1726882630.30295: getting the remaining hosts for this loop 30529 1726882630.30297: done getting the remaining hosts for this loop 30529 1726882630.30301: getting the next task for host managed_node1 30529 1726882630.30310: done getting next task for host managed_node1 30529 1726882630.30313: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882630.30320: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.30324: getting variables 30529 1726882630.30326: in VariableManager get_vars() 30529 1726882630.30360: Calling all_inventory to load vars for managed_node1 30529 1726882630.30362: Calling groups_inventory to load vars for managed_node1 30529 1726882630.30365: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.30376: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.30379: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.30382: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.31224: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f16 30529 1726882630.31228: WORKER PROCESS EXITING 30529 1726882630.31238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.32547: done with get_vars() 30529 1726882630.32592: done getting variables 30529 1726882630.32725: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:37:10 -0400 (0:00:00.434) 0:00:44.353 ****** 30529 1726882630.32762: entering _queue_task() for managed_node1/set_fact 30529 1726882630.33170: worker is 1 (out of 1 available) 30529 1726882630.33184: exiting _queue_task() for managed_node1/set_fact 30529 1726882630.33203: done queuing things up, now waiting for results queue to drain 30529 1726882630.33207: waiting for pending results... 
30529 1726882630.33595: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882630.33692: in run() - task 12673a56-9f93-b0f1-edc0-000000000f17 30529 1726882630.33863: variable 'ansible_search_path' from source: unknown 30529 1726882630.33866: variable 'ansible_search_path' from source: unknown 30529 1726882630.33870: calling self._execute() 30529 1726882630.33910: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.33942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.33946: variable 'omit' from source: magic vars 30529 1726882630.34663: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.34666: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.35025: variable 'nm_profile_exists' from source: set_fact 30529 1726882630.35042: Evaluated conditional (nm_profile_exists.rc == 0): True 30529 1726882630.35053: variable 'omit' from source: magic vars 30529 1726882630.35152: variable 'omit' from source: magic vars 30529 1726882630.35205: variable 'omit' from source: magic vars 30529 1726882630.35309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882630.35360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882630.35389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882630.35410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.35431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.35483: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
30529 1726882630.35487: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.35489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.35625: Set connection var ansible_shell_executable to /bin/sh 30529 1726882630.35628: Set connection var ansible_pipelining to False 30529 1726882630.35631: Set connection var ansible_shell_type to sh 30529 1726882630.35668: Set connection var ansible_timeout to 10 30529 1726882630.35695: Set connection var ansible_connection to ssh 30529 1726882630.35699: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882630.35701: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.35704: variable 'ansible_connection' from source: unknown 30529 1726882630.35706: variable 'ansible_module_compression' from source: unknown 30529 1726882630.35708: variable 'ansible_shell_type' from source: unknown 30529 1726882630.35710: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.35726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.35736: variable 'ansible_pipelining' from source: unknown 30529 1726882630.35739: variable 'ansible_timeout' from source: unknown 30529 1726882630.35742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.35903: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882630.35909: variable 'omit' from source: magic vars 30529 1726882630.35912: starting attempt loop 30529 1726882630.35914: running the handler 30529 1726882630.35936: handler run complete 30529 1726882630.35944: attempt loop complete, returning result 30529 1726882630.35964: _execute() done 
30529 1726882630.35966: dumping result to json 30529 1726882630.35970: done dumping result, returning 30529 1726882630.36005: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-000000000f17] 30529 1726882630.36009: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f17 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30529 1726882630.36170: no more pending results, returning what we have 30529 1726882630.36174: results queue empty 30529 1726882630.36175: checking for any_errors_fatal 30529 1726882630.36195: done checking for any_errors_fatal 30529 1726882630.36198: checking for max_fail_percentage 30529 1726882630.36200: done checking for max_fail_percentage 30529 1726882630.36201: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.36202: done checking to see if all hosts have failed 30529 1726882630.36203: getting the remaining hosts for this loop 30529 1726882630.36207: done getting the remaining hosts for this loop 30529 1726882630.36215: getting the next task for host managed_node1 30529 1726882630.36227: done getting next task for host managed_node1 30529 1726882630.36230: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882630.36240: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882630.36244: getting variables 30529 1726882630.36247: in VariableManager get_vars() 30529 1726882630.36282: Calling all_inventory to load vars for managed_node1 30529 1726882630.36284: Calling groups_inventory to load vars for managed_node1 30529 1726882630.36288: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.36306: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.36309: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.36312: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.36890: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f17 30529 1726882630.36901: WORKER PROCESS EXITING 30529 1726882630.37648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.39418: done with get_vars() 30529 1726882630.39440: done getting variables 30529 1726882630.39625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.39769: variable 'profile' from source: play vars 30529 1726882630.39774: variable 'interface' from source: play vars 30529 1726882630.39864: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:37:10 -0400 (0:00:00.071) 0:00:44.425 ****** 30529 1726882630.39911: entering _queue_task() for managed_node1/command 30529 1726882630.40265: worker is 1 (out of 1 available) 30529 1726882630.40279: exiting _queue_task() for managed_node1/command 30529 1726882630.40297: done queuing things up, now waiting for results queue to drain 30529 1726882630.40298: waiting for pending results... 
30529 1726882630.40719: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882630.40758: in run() - task 12673a56-9f93-b0f1-edc0-000000000f19 30529 1726882630.40776: variable 'ansible_search_path' from source: unknown 30529 1726882630.40787: variable 'ansible_search_path' from source: unknown 30529 1726882630.40838: calling self._execute() 30529 1726882630.40964: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.40980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.41003: variable 'omit' from source: magic vars 30529 1726882630.41402: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.41428: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.41576: variable 'profile_stat' from source: set_fact 30529 1726882630.41599: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882630.41613: when evaluation is False, skipping this task 30529 1726882630.41623: _execute() done 30529 1726882630.41635: dumping result to json 30529 1726882630.41642: done dumping result, returning 30529 1726882630.41660: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000f19] 30529 1726882630.41672: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f19 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882630.41864: no more pending results, returning what we have 30529 1726882630.41868: results queue empty 30529 1726882630.41872: checking for any_errors_fatal 30529 1726882630.41883: done checking for any_errors_fatal 30529 1726882630.41884: checking for max_fail_percentage 30529 1726882630.41886: done checking for max_fail_percentage 30529 1726882630.41887: checking to see if all hosts 
have failed and the running result is not ok 30529 1726882630.41888: done checking to see if all hosts have failed 30529 1726882630.41888: getting the remaining hosts for this loop 30529 1726882630.41894: done getting the remaining hosts for this loop 30529 1726882630.41900: getting the next task for host managed_node1 30529 1726882630.41910: done getting next task for host managed_node1 30529 1726882630.41913: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882630.41918: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.41923: getting variables 30529 1726882630.41924: in VariableManager get_vars() 30529 1726882630.41959: Calling all_inventory to load vars for managed_node1 30529 1726882630.41961: Calling groups_inventory to load vars for managed_node1 30529 1726882630.41965: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.41985: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.41989: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.42196: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.42807: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f19 30529 1726882630.42811: WORKER PROCESS EXITING 30529 1726882630.43532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.44529: done with get_vars() 30529 1726882630.44545: done getting variables 30529 1726882630.44588: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.44667: variable 'profile' from source: play vars 30529 1726882630.44670: variable 'interface' from source: play vars 30529 1726882630.44712: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:37:10 -0400 (0:00:00.048) 0:00:44.473 ****** 30529 1726882630.44735: entering _queue_task() for managed_node1/set_fact 30529 1726882630.45004: worker is 1 (out of 1 available) 30529 1726882630.45019: exiting _queue_task() for managed_node1/set_fact 30529 
1726882630.45031: done queuing things up, now waiting for results queue to drain 30529 1726882630.45033: waiting for pending results... 30529 1726882630.45275: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882630.45400: in run() - task 12673a56-9f93-b0f1-edc0-000000000f1a 30529 1726882630.45430: variable 'ansible_search_path' from source: unknown 30529 1726882630.45437: variable 'ansible_search_path' from source: unknown 30529 1726882630.45470: calling self._execute() 30529 1726882630.45542: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.45555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.45562: variable 'omit' from source: magic vars 30529 1726882630.45971: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.46058: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.46232: variable 'profile_stat' from source: set_fact 30529 1726882630.46419: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882630.46422: when evaluation is False, skipping this task 30529 1726882630.46424: _execute() done 30529 1726882630.46427: dumping result to json 30529 1726882630.46429: done dumping result, returning 30529 1726882630.46432: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000f1a] 30529 1726882630.46434: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1a 30529 1726882630.46510: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1a 30529 1726882630.46513: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882630.46576: no more pending results, returning what we have 30529 1726882630.46581: results queue empty 
30529 1726882630.46582: checking for any_errors_fatal 30529 1726882630.46588: done checking for any_errors_fatal 30529 1726882630.46589: checking for max_fail_percentage 30529 1726882630.46590: done checking for max_fail_percentage 30529 1726882630.46591: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.46592: done checking to see if all hosts have failed 30529 1726882630.46595: getting the remaining hosts for this loop 30529 1726882630.46597: done getting the remaining hosts for this loop 30529 1726882630.46601: getting the next task for host managed_node1 30529 1726882630.46611: done getting next task for host managed_node1 30529 1726882630.46613: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882630.46618: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.46625: getting variables 30529 1726882630.46627: in VariableManager get_vars() 30529 1726882630.46669: Calling all_inventory to load vars for managed_node1 30529 1726882630.46672: Calling groups_inventory to load vars for managed_node1 30529 1726882630.46678: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.46815: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.46822: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.46826: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.48769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.51248: done with get_vars() 30529 1726882630.51274: done getting variables 30529 1726882630.51342: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.51461: variable 'profile' from source: play vars 30529 1726882630.51465: variable 'interface' from source: play vars 30529 1726882630.51526: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:37:10 -0400 (0:00:00.068) 0:00:44.541 ****** 30529 1726882630.51562: entering _queue_task() for managed_node1/command 30529 1726882630.51992: worker is 1 (out of 1 available) 30529 1726882630.52251: exiting _queue_task() for managed_node1/command 30529 1726882630.52264: done queuing things up, now waiting for results queue to drain 30529 1726882630.52265: waiting for pending results... 
30529 1726882630.52619: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882630.52624: in run() - task 12673a56-9f93-b0f1-edc0-000000000f1b 30529 1726882630.52627: variable 'ansible_search_path' from source: unknown 30529 1726882630.52630: variable 'ansible_search_path' from source: unknown 30529 1726882630.52633: calling self._execute() 30529 1726882630.52700: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.52721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.52736: variable 'omit' from source: magic vars 30529 1726882630.53133: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.53157: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.53295: variable 'profile_stat' from source: set_fact 30529 1726882630.53314: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882630.53323: when evaluation is False, skipping this task 30529 1726882630.53330: _execute() done 30529 1726882630.53337: dumping result to json 30529 1726882630.53344: done dumping result, returning 30529 1726882630.53355: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000f1b] 30529 1726882630.53371: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1b 30529 1726882630.53545: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1b 30529 1726882630.53548: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882630.53638: no more pending results, returning what we have 30529 1726882630.53643: results queue empty 30529 1726882630.53644: checking for any_errors_fatal 30529 1726882630.53650: done checking for any_errors_fatal 30529 1726882630.53651: checking for 
max_fail_percentage 30529 1726882630.53653: done checking for max_fail_percentage 30529 1726882630.53654: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.53655: done checking to see if all hosts have failed 30529 1726882630.53656: getting the remaining hosts for this loop 30529 1726882630.53658: done getting the remaining hosts for this loop 30529 1726882630.53661: getting the next task for host managed_node1 30529 1726882630.53671: done getting next task for host managed_node1 30529 1726882630.53673: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882630.53679: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.53798: getting variables 30529 1726882630.53800: in VariableManager get_vars() 30529 1726882630.53838: Calling all_inventory to load vars for managed_node1 30529 1726882630.53841: Calling groups_inventory to load vars for managed_node1 30529 1726882630.53844: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.53857: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.53861: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.53864: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.55536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.58046: done with get_vars() 30529 1726882630.58071: done getting variables 30529 1726882630.58141: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.58520: variable 'profile' from source: play vars 30529 1726882630.58524: variable 'interface' from source: play vars 30529 1726882630.58702: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:37:10 -0400 (0:00:00.071) 0:00:44.613 ****** 30529 1726882630.58734: entering _queue_task() for managed_node1/set_fact 30529 1726882630.59539: worker is 1 (out of 1 available) 30529 1726882630.59550: exiting _queue_task() for managed_node1/set_fact 30529 1726882630.59562: done queuing things up, now waiting for results queue to drain 30529 1726882630.59563: waiting for pending results... 
30529 1726882630.59975: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882630.60071: in run() - task 12673a56-9f93-b0f1-edc0-000000000f1c 30529 1726882630.60077: variable 'ansible_search_path' from source: unknown 30529 1726882630.60079: variable 'ansible_search_path' from source: unknown 30529 1726882630.60105: calling self._execute() 30529 1726882630.60202: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.60206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.60214: variable 'omit' from source: magic vars 30529 1726882630.60564: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.60580: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.61103: variable 'profile_stat' from source: set_fact 30529 1726882630.61106: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882630.61108: when evaluation is False, skipping this task 30529 1726882630.61110: _execute() done 30529 1726882630.61111: dumping result to json 30529 1726882630.61113: done dumping result, returning 30529 1726882630.61116: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000000f1c] 30529 1726882630.61117: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1c 30529 1726882630.61289: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f1c 30529 1726882630.61299: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882630.61334: no more pending results, returning what we have 30529 1726882630.61338: results queue empty 30529 1726882630.61339: checking for any_errors_fatal 30529 1726882630.61345: done checking for any_errors_fatal 30529 1726882630.61345: checking 
for max_fail_percentage 30529 1726882630.61347: done checking for max_fail_percentage 30529 1726882630.61348: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.61349: done checking to see if all hosts have failed 30529 1726882630.61349: getting the remaining hosts for this loop 30529 1726882630.61351: done getting the remaining hosts for this loop 30529 1726882630.61354: getting the next task for host managed_node1 30529 1726882630.61362: done getting next task for host managed_node1 30529 1726882630.61365: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30529 1726882630.61369: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.61373: getting variables 30529 1726882630.61375: in VariableManager get_vars() 30529 1726882630.61404: Calling all_inventory to load vars for managed_node1 30529 1726882630.61407: Calling groups_inventory to load vars for managed_node1 30529 1726882630.61410: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.61422: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.61426: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.61429: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.63529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.64976: done with get_vars() 30529 1726882630.65005: done getting variables 30529 1726882630.65076: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.65223: variable 'profile' from source: play vars 30529 1726882630.65227: variable 'interface' from source: play vars 30529 1726882630.65292: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:37:10 -0400 (0:00:00.065) 0:00:44.679 ****** 30529 1726882630.65317: entering _queue_task() for managed_node1/assert 30529 1726882630.65675: worker is 1 (out of 1 available) 30529 1726882630.65690: exiting _queue_task() for managed_node1/assert 30529 1726882630.65706: done queuing things up, now waiting for results queue to drain 30529 1726882630.65708: waiting for pending results... 
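The next task runs the `assert` action module rather than skipping. Again as a hedged sketch (the actual `assert_profile_present.yml` body is not reproduced in the log), a task that loads the `assert` action, evaluates `lsr_net_profile_exists`, and reports `All assertions passed` as seen below would be:

```yaml
# Hypothetical sketch of assert_profile_present.yml:5 based on the trace;
# lsr_net_profile_exists is the set_fact variable the handler evaluates.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```

`assert` runs entirely on the controller (note the trace still resolves the ssh connection vars first, since action plugins are set up per-host), and a passing run returns `changed: false` with the default `All assertions passed` message.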
30529 1726882630.65911: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' 30529 1726882630.66210: in run() - task 12673a56-9f93-b0f1-edc0-000000000e8c 30529 1726882630.66215: variable 'ansible_search_path' from source: unknown 30529 1726882630.66218: variable 'ansible_search_path' from source: unknown 30529 1726882630.66221: calling self._execute() 30529 1726882630.66248: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.66265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.66286: variable 'omit' from source: magic vars 30529 1726882630.66977: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.67077: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.67082: variable 'omit' from source: magic vars 30529 1726882630.67085: variable 'omit' from source: magic vars 30529 1726882630.67266: variable 'profile' from source: play vars 30529 1726882630.67282: variable 'interface' from source: play vars 30529 1726882630.67374: variable 'interface' from source: play vars 30529 1726882630.67412: variable 'omit' from source: magic vars 30529 1726882630.67480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882630.67562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882630.67599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882630.67647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.67698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.67708: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882630.67716: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.67728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.67870: Set connection var ansible_shell_executable to /bin/sh 30529 1726882630.67972: Set connection var ansible_pipelining to False 30529 1726882630.67977: Set connection var ansible_shell_type to sh 30529 1726882630.67979: Set connection var ansible_timeout to 10 30529 1726882630.67981: Set connection var ansible_connection to ssh 30529 1726882630.67983: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882630.67985: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.67986: variable 'ansible_connection' from source: unknown 30529 1726882630.67988: variable 'ansible_module_compression' from source: unknown 30529 1726882630.67990: variable 'ansible_shell_type' from source: unknown 30529 1726882630.67991: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.68001: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.68004: variable 'ansible_pipelining' from source: unknown 30529 1726882630.68006: variable 'ansible_timeout' from source: unknown 30529 1726882630.68008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.68243: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882630.68246: variable 'omit' from source: magic vars 30529 1726882630.68249: starting attempt loop 30529 1726882630.68251: running the handler 30529 1726882630.68384: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882630.68398: Evaluated conditional 
(lsr_net_profile_exists): True 30529 1726882630.68410: handler run complete 30529 1726882630.68440: attempt loop complete, returning result 30529 1726882630.68470: _execute() done 30529 1726882630.68480: dumping result to json 30529 1726882630.68498: done dumping result, returning 30529 1726882630.68552: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'statebr' [12673a56-9f93-b0f1-edc0-000000000e8c] 30529 1726882630.68555: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8c ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882630.68845: no more pending results, returning what we have 30529 1726882630.68850: results queue empty 30529 1726882630.68851: checking for any_errors_fatal 30529 1726882630.68857: done checking for any_errors_fatal 30529 1726882630.68858: checking for max_fail_percentage 30529 1726882630.68859: done checking for max_fail_percentage 30529 1726882630.68860: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.68862: done checking to see if all hosts have failed 30529 1726882630.68863: getting the remaining hosts for this loop 30529 1726882630.68864: done getting the remaining hosts for this loop 30529 1726882630.68868: getting the next task for host managed_node1 30529 1726882630.68878: done getting next task for host managed_node1 30529 1726882630.68884: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30529 1726882630.68888: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882630.68892: getting variables 30529 1726882630.68896: in VariableManager get_vars() 30529 1726882630.68938: Calling all_inventory to load vars for managed_node1 30529 1726882630.68941: Calling groups_inventory to load vars for managed_node1 30529 1726882630.68945: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.68956: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.68960: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.68963: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.69525: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8c 30529 1726882630.69529: WORKER PROCESS EXITING 30529 1726882630.77554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.79846: done with get_vars() 30529 1726882630.79941: done getting variables 30529 1726882630.80000: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.80318: variable 'profile' from source: play vars 30529 1726882630.80322: variable 'interface' from source: play vars 30529 1726882630.80498: variable 'interface' from source: play vars TASK [Assert that the 
ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:37:10 -0400 (0:00:00.152) 0:00:44.831 ****** 30529 1726882630.80543: entering _queue_task() for managed_node1/assert 30529 1726882630.81713: worker is 1 (out of 1 available) 30529 1726882630.81786: exiting _queue_task() for managed_node1/assert 30529 1726882630.81803: done queuing things up, now waiting for results queue to drain 30529 1726882630.81806: waiting for pending results... 30529 1726882630.82087: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' 30529 1726882630.82247: in run() - task 12673a56-9f93-b0f1-edc0-000000000e8d 30529 1726882630.82277: variable 'ansible_search_path' from source: unknown 30529 1726882630.82327: variable 'ansible_search_path' from source: unknown 30529 1726882630.82336: calling self._execute() 30529 1726882630.82446: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.82458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.82473: variable 'omit' from source: magic vars 30529 1726882630.82935: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.82939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.82941: variable 'omit' from source: magic vars 30529 1726882630.82950: variable 'omit' from source: magic vars 30529 1726882630.83049: variable 'profile' from source: play vars 30529 1726882630.83058: variable 'interface' from source: play vars 30529 1726882630.83129: variable 'interface' from source: play vars 30529 1726882630.83156: variable 'omit' from source: magic vars 30529 1726882630.83203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882630.83245: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882630.83371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882630.83374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.83377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.83379: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882630.83384: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.83392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.83651: Set connection var ansible_shell_executable to /bin/sh 30529 1726882630.83848: Set connection var ansible_pipelining to False 30529 1726882630.83851: Set connection var ansible_shell_type to sh 30529 1726882630.83853: Set connection var ansible_timeout to 10 30529 1726882630.83855: Set connection var ansible_connection to ssh 30529 1726882630.83858: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882630.84069: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.84709: variable 'ansible_connection' from source: unknown 30529 1726882630.84712: variable 'ansible_module_compression' from source: unknown 30529 1726882630.84715: variable 'ansible_shell_type' from source: unknown 30529 1726882630.84717: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.84719: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.84721: variable 'ansible_pipelining' from source: unknown 30529 1726882630.84723: variable 'ansible_timeout' from source: unknown 30529 1726882630.84725: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882630.84727: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882630.84730: variable 'omit' from source: magic vars 30529 1726882630.84731: starting attempt loop 30529 1726882630.84733: running the handler 30529 1726882630.85063: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30529 1726882630.85075: Evaluated conditional (lsr_net_profile_ansible_managed): True 30529 1726882630.85254: handler run complete 30529 1726882630.85257: attempt loop complete, returning result 30529 1726882630.85260: _execute() done 30529 1726882630.85263: dumping result to json 30529 1726882630.85265: done dumping result, returning 30529 1726882630.85268: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'statebr' [12673a56-9f93-b0f1-edc0-000000000e8d] 30529 1726882630.85271: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8d 30529 1726882630.85538: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8d 30529 1726882630.85541: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882630.85629: no more pending results, returning what we have 30529 1726882630.85633: results queue empty 30529 1726882630.85634: checking for any_errors_fatal 30529 1726882630.85647: done checking for any_errors_fatal 30529 1726882630.85648: checking for max_fail_percentage 30529 1726882630.85650: done checking for max_fail_percentage 30529 1726882630.85651: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.85652: done checking to see if all hosts have failed 30529 1726882630.85653: getting the remaining hosts for this loop 30529 
1726882630.85655: done getting the remaining hosts for this loop 30529 1726882630.85659: getting the next task for host managed_node1 30529 1726882630.85667: done getting next task for host managed_node1 30529 1726882630.85671: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30529 1726882630.85674: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882630.85680: getting variables 30529 1726882630.85683: in VariableManager get_vars() 30529 1726882630.85724: Calling all_inventory to load vars for managed_node1 30529 1726882630.85727: Calling groups_inventory to load vars for managed_node1 30529 1726882630.85732: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.85744: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.85748: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.85751: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.89275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.92047: done with get_vars() 30529 1726882630.92074: done getting variables 30529 1726882630.92136: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882630.92266: variable 'profile' from source: play vars 30529 1726882630.92269: variable 'interface' from source: play vars 30529 1726882630.92349: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:37:10 -0400 (0:00:00.118) 0:00:44.949 ****** 30529 1726882630.92386: entering _queue_task() for managed_node1/assert 30529 1726882630.92741: worker is 1 (out of 1 available) 30529 1726882630.92756: exiting _queue_task() for managed_node1/assert 30529 1726882630.92767: done queuing things up, now waiting for results queue to drain 30529 1726882630.92772: waiting for pending results... 
30529 1726882630.93101: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr 30529 1726882630.93250: in run() - task 12673a56-9f93-b0f1-edc0-000000000e8e 30529 1726882630.93271: variable 'ansible_search_path' from source: unknown 30529 1726882630.93279: variable 'ansible_search_path' from source: unknown 30529 1726882630.93333: calling self._execute() 30529 1726882630.93432: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.93443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.93458: variable 'omit' from source: magic vars 30529 1726882630.93846: variable 'ansible_distribution_major_version' from source: facts 30529 1726882630.93878: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882630.93896: variable 'omit' from source: magic vars 30529 1726882630.93948: variable 'omit' from source: magic vars 30529 1726882630.94058: variable 'profile' from source: play vars 30529 1726882630.94102: variable 'interface' from source: play vars 30529 1726882630.94163: variable 'interface' from source: play vars 30529 1726882630.94299: variable 'omit' from source: magic vars 30529 1726882630.94302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882630.94305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882630.94307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882630.94322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.94338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882630.94373: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 30529 1726882630.94380: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.94387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.94498: Set connection var ansible_shell_executable to /bin/sh 30529 1726882630.94509: Set connection var ansible_pipelining to False 30529 1726882630.94516: Set connection var ansible_shell_type to sh 30529 1726882630.94535: Set connection var ansible_timeout to 10 30529 1726882630.94541: Set connection var ansible_connection to ssh 30529 1726882630.94551: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882630.94577: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.94584: variable 'ansible_connection' from source: unknown 30529 1726882630.94599: variable 'ansible_module_compression' from source: unknown 30529 1726882630.94608: variable 'ansible_shell_type' from source: unknown 30529 1726882630.94615: variable 'ansible_shell_executable' from source: unknown 30529 1726882630.94639: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882630.94642: variable 'ansible_pipelining' from source: unknown 30529 1726882630.94644: variable 'ansible_timeout' from source: unknown 30529 1726882630.94646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882630.94785: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882630.94857: variable 'omit' from source: magic vars 30529 1726882630.94860: starting attempt loop 30529 1726882630.94865: running the handler 30529 1726882630.94946: variable 'lsr_net_profile_fingerprint' from source: set_fact 30529 1726882630.94956: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30529 1726882630.94971: handler run complete 30529 1726882630.94994: attempt loop complete, returning result 30529 1726882630.95003: _execute() done 30529 1726882630.95074: dumping result to json 30529 1726882630.95078: done dumping result, returning 30529 1726882630.95080: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in statebr [12673a56-9f93-b0f1-edc0-000000000e8e] 30529 1726882630.95083: sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8e 30529 1726882630.95153: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000e8e 30529 1726882630.95156: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882630.95231: no more pending results, returning what we have 30529 1726882630.95235: results queue empty 30529 1726882630.95237: checking for any_errors_fatal 30529 1726882630.95243: done checking for any_errors_fatal 30529 1726882630.95244: checking for max_fail_percentage 30529 1726882630.95246: done checking for max_fail_percentage 30529 1726882630.95247: checking to see if all hosts have failed and the running result is not ok 30529 1726882630.95248: done checking to see if all hosts have failed 30529 1726882630.95249: getting the remaining hosts for this loop 30529 1726882630.95251: done getting the remaining hosts for this loop 30529 1726882630.95255: getting the next task for host managed_node1 30529 1726882630.95266: done getting next task for host managed_node1 30529 1726882630.95270: ^ task is: TASK: Conditional asserts 30529 1726882630.95273: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882630.95278: getting variables 30529 1726882630.95282: in VariableManager get_vars() 30529 1726882630.95327: Calling all_inventory to load vars for managed_node1 30529 1726882630.95330: Calling groups_inventory to load vars for managed_node1 30529 1726882630.95335: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882630.95348: Calling all_plugins_play to load vars for managed_node1 30529 1726882630.95352: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882630.95355: Calling groups_plugins_play to load vars for managed_node1 30529 1726882630.97016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882630.98713: done with get_vars() 30529 1726882630.98748: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:37:10 -0400 (0:00:00.064) 0:00:45.014 ****** 30529 1726882630.98876: entering _queue_task() for managed_node1/include_tasks 30529 1726882630.99262: worker is 1 (out of 1 available) 30529 1726882630.99277: exiting _queue_task() for managed_node1/include_tasks 30529 1726882630.99399: done queuing things up, now waiting for results queue to drain 30529 1726882630.99401: waiting for pending results... 
30529 1726882630.99626: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882631.00099: in run() - task 12673a56-9f93-b0f1-edc0-000000000a4f 30529 1726882631.00103: variable 'ansible_search_path' from source: unknown 30529 1726882631.00106: variable 'ansible_search_path' from source: unknown 30529 1726882631.00364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882631.03079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882631.03153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882631.03202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882631.03242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882631.03280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882631.03382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882631.03422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882631.03452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882631.03507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882631.03530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882631.03695: dumping result to json 30529 1726882631.03798: done dumping result, returning 30529 1726882631.03801: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-000000000a4f] 30529 1726882631.03803: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4f 30529 1726882631.04099: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a4f 30529 1726882631.04103: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } 30529 1726882631.04162: no more pending results, returning what we have 30529 1726882631.04166: results queue empty 30529 1726882631.04167: checking for any_errors_fatal 30529 1726882631.04173: done checking for any_errors_fatal 30529 1726882631.04174: checking for max_fail_percentage 30529 1726882631.04176: done checking for max_fail_percentage 30529 1726882631.04177: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.04178: done checking to see if all hosts have failed 30529 1726882631.04178: getting the remaining hosts for this loop 30529 1726882631.04180: done getting the remaining hosts for this loop 30529 1726882631.04187: getting the next task for host managed_node1 30529 1726882631.04201: done getting next task for host managed_node1 30529 1726882631.04204: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882631.04207: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882631.04211: getting variables 30529 1726882631.04213: in VariableManager get_vars() 30529 1726882631.04252: Calling all_inventory to load vars for managed_node1 30529 1726882631.04255: Calling groups_inventory to load vars for managed_node1 30529 1726882631.04259: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.04269: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.04272: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.04275: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.06480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.08627: done with get_vars() 30529 1726882631.08648: done getting variables 30529 1726882631.08709: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882631.08833: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:37:11 -0400 (0:00:00.099) 0:00:45.114 ****** 30529 1726882631.08863: entering _queue_task() for managed_node1/debug 30529 1726882631.09207: worker is 1 (out of 
1 available) 30529 1726882631.09221: exiting _queue_task() for managed_node1/debug 30529 1726882631.09233: done queuing things up, now waiting for results queue to drain 30529 1726882631.09235: waiting for pending results... 30529 1726882631.09530: running TaskExecutor() for managed_node1/TASK: Success in test 'I can activate an existing profile' 30529 1726882631.09677: in run() - task 12673a56-9f93-b0f1-edc0-000000000a50 30529 1726882631.09705: variable 'ansible_search_path' from source: unknown 30529 1726882631.09715: variable 'ansible_search_path' from source: unknown 30529 1726882631.09761: calling self._execute() 30529 1726882631.09863: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.09878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.09900: variable 'omit' from source: magic vars 30529 1726882631.10398: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.10402: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.10405: variable 'omit' from source: magic vars 30529 1726882631.10408: variable 'omit' from source: magic vars 30529 1726882631.10502: variable 'lsr_description' from source: include params 30529 1726882631.10543: variable 'omit' from source: magic vars 30529 1726882631.10824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882631.11107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.11111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882631.11114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.11116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882631.11118: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.11120: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.11123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.11398: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.11408: Set connection var ansible_pipelining to False 30529 1726882631.11435: Set connection var ansible_shell_type to sh 30529 1726882631.11458: Set connection var ansible_timeout to 10 30529 1726882631.11800: Set connection var ansible_connection to ssh 30529 1726882631.11803: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.11806: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.11808: variable 'ansible_connection' from source: unknown 30529 1726882631.11810: variable 'ansible_module_compression' from source: unknown 30529 1726882631.11812: variable 'ansible_shell_type' from source: unknown 30529 1726882631.11815: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.11817: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.11819: variable 'ansible_pipelining' from source: unknown 30529 1726882631.11821: variable 'ansible_timeout' from source: unknown 30529 1726882631.11823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.11933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.11959: variable 'omit' from source: magic vars 30529 1726882631.12030: starting attempt loop 30529 1726882631.12055: running the handler 30529 
1726882631.12248: handler run complete 30529 1726882631.12278: attempt loop complete, returning result 30529 1726882631.12287: _execute() done 30529 1726882631.12300: dumping result to json 30529 1726882631.12308: done dumping result, returning 30529 1726882631.12325: done running TaskExecutor() for managed_node1/TASK: Success in test 'I can activate an existing profile' [12673a56-9f93-b0f1-edc0-000000000a50] 30529 1726882631.12336: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a50 ok: [managed_node1] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 30529 1726882631.12565: no more pending results, returning what we have 30529 1726882631.12568: results queue empty 30529 1726882631.12570: checking for any_errors_fatal 30529 1726882631.12577: done checking for any_errors_fatal 30529 1726882631.12578: checking for max_fail_percentage 30529 1726882631.12580: done checking for max_fail_percentage 30529 1726882631.12581: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.12582: done checking to see if all hosts have failed 30529 1726882631.12582: getting the remaining hosts for this loop 30529 1726882631.12584: done getting the remaining hosts for this loop 30529 1726882631.12588: getting the next task for host managed_node1 30529 1726882631.12603: done getting next task for host managed_node1 30529 1726882631.12607: ^ task is: TASK: Cleanup 30529 1726882631.12610: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30529 1726882631.12617: getting variables 30529 1726882631.12619: in VariableManager get_vars() 30529 1726882631.12651: Calling all_inventory to load vars for managed_node1 30529 1726882631.12654: Calling groups_inventory to load vars for managed_node1 30529 1726882631.12658: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.12675: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.12680: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.12683: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.12698: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a50 30529 1726882631.12701: WORKER PROCESS EXITING 30529 1726882631.14326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.15895: done with get_vars() 30529 1726882631.15915: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:37:11 -0400 (0:00:00.071) 0:00:45.186 ****** 30529 1726882631.16011: entering _queue_task() for managed_node1/include_tasks 30529 1726882631.16326: worker is 1 (out of 1 available) 30529 1726882631.16337: exiting _queue_task() for managed_node1/include_tasks 30529 1726882631.16350: done queuing things up, now waiting for results queue to drain 30529 1726882631.16352: waiting for pending results... 
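The OpenSSH debug chatter that appears once this cleanup task starts executing ("auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'") is Ansible reusing a persistent ControlMaster connection: the first SSH session to 10.31.9.159 leaves a master process listening on a control socket, and every later task command multiplexes over that socket instead of renegotiating a new connection. A hedged sketch of probing such a socket by hand, treating the control path and host from this log as example values:

```shell
# Control path as recorded in this log; treat it as an example value.
cp=/root/.ansible/cp/5685534f65

if [ -S "$cp" ]; then
    # 'ssh -O check' asks the running master process whether it is alive,
    # without opening a new session.
    ssh -o ControlPath="$cp" -O check 10.31.9.159
else
    echo "no master socket at $cp"
fi
```

If the master is gone, Ansible's ssh plugin transparently falls back to starting a fresh one, which is why the log only shows "Trying existing master" rather than an error.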
30529 1726882631.16651: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882631.16801: in run() - task 12673a56-9f93-b0f1-edc0-000000000a54 30529 1726882631.16805: variable 'ansible_search_path' from source: unknown 30529 1726882631.16807: variable 'ansible_search_path' from source: unknown 30529 1726882631.16844: variable 'lsr_cleanup' from source: include params 30529 1726882631.17198: variable 'lsr_cleanup' from source: include params 30529 1726882631.17202: variable 'omit' from source: magic vars 30529 1726882631.17269: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.17284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.17304: variable 'omit' from source: magic vars 30529 1726882631.17551: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.17566: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.17579: variable 'item' from source: unknown 30529 1726882631.17652: variable 'item' from source: unknown 30529 1726882631.17687: variable 'item' from source: unknown 30529 1726882631.17755: variable 'item' from source: unknown 30529 1726882631.18028: dumping result to json 30529 1726882631.18032: done dumping result, returning 30529 1726882631.18034: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-000000000a54] 30529 1726882631.18040: sending task result for task 12673a56-9f93-b0f1-edc0-000000000a54 30529 1726882631.18088: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000a54 30529 1726882631.18122: no more pending results, returning what we have 30529 1726882631.18130: in VariableManager get_vars() 30529 1726882631.18167: Calling all_inventory to load vars for managed_node1 30529 1726882631.18169: Calling groups_inventory to load vars for managed_node1 30529 1726882631.18173: Calling all_plugins_inventory to load vars for managed_node1 30529 
1726882631.18187: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.18196: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.18203: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.18806: WORKER PROCESS EXITING 30529 1726882631.19796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.20880: done with get_vars() 30529 1726882631.20897: variable 'ansible_search_path' from source: unknown 30529 1726882631.20898: variable 'ansible_search_path' from source: unknown 30529 1726882631.20926: we have included files to process 30529 1726882631.20926: generating all_blocks data 30529 1726882631.20928: done generating all_blocks data 30529 1726882631.20933: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882631.20933: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882631.20935: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882631.21084: done processing included file 30529 1726882631.21086: iterating over new_blocks loaded from include file 30529 1726882631.21087: in VariableManager get_vars() 30529 1726882631.21101: done with get_vars() 30529 1726882631.21103: filtering new block on tags 30529 1726882631.21119: done filtering new block on tags 30529 1726882631.21123: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882631.21128: extending task lists for all hosts with included blocks 30529 
1726882631.22362: done extending task lists 30529 1726882631.22363: done processing included files 30529 1726882631.22364: results queue empty 30529 1726882631.22365: checking for any_errors_fatal 30529 1726882631.22367: done checking for any_errors_fatal 30529 1726882631.22368: checking for max_fail_percentage 30529 1726882631.22369: done checking for max_fail_percentage 30529 1726882631.22369: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.22370: done checking to see if all hosts have failed 30529 1726882631.22371: getting the remaining hosts for this loop 30529 1726882631.22372: done getting the remaining hosts for this loop 30529 1726882631.22374: getting the next task for host managed_node1 30529 1726882631.22377: done getting next task for host managed_node1 30529 1726882631.22379: ^ task is: TASK: Cleanup profile and device 30529 1726882631.22381: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882631.22383: getting variables 30529 1726882631.22384: in VariableManager get_vars() 30529 1726882631.22397: Calling all_inventory to load vars for managed_node1 30529 1726882631.22399: Calling groups_inventory to load vars for managed_node1 30529 1726882631.22401: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.22406: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.22409: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.22411: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.23546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.24487: done with get_vars() 30529 1726882631.24505: done getting variables 30529 1726882631.24533: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:37:11 -0400 (0:00:00.085) 0:00:45.271 ****** 30529 1726882631.24553: entering _queue_task() for managed_node1/shell 30529 1726882631.24803: worker is 1 (out of 1 available) 30529 1726882631.24815: exiting _queue_task() for managed_node1/shell 30529 1726882631.24828: done queuing things up, now waiting for results queue to drain 30529 1726882631.24829: waiting for pending results... 
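Before running the module for a shell task like this one, Ansible's ssh connection plugin issues the two `_low_level_execute_command()` calls recorded below: one to discover the remote home directory, and one to create a private temporary directory whose name is echoed back on stdout for the controller to parse. A minimal local sketch of those same two steps, using `$HOME` and a stand-in directory name instead of the `/root` and timestamped `ansible-tmp-...` names seen in the log:

```shell
# Step 1: discover the home directory, exactly as in the log
# ("/bin/sh -c 'echo ~ && sleep 0'").
home=$(/bin/sh -c 'echo ~ && sleep 0')

# Step 2: create a private temp dir (umask 77 => mode 0700) and echo its
# name back on stdout. The name here is a stand-in for the timestamped
# ansible-tmp-<epoch>-<pid>-<random> directories in the log.
tmpdir="$home/.ansible/tmp/ansible-tmp-sketch-$$"
/bin/sh -c "( umask 77 && mkdir -p \"$home/.ansible/tmp\" && mkdir \"$tmpdir\" && echo ansible_tmp=\"$tmpdir\" ) && sleep 0"
```

The `&& sleep 0` suffix matches what the log shows Ansible appending to each low-level command; the echoed `ansible-tmp-...=` line is what the controller scrapes from stdout to learn where to upload the module payload.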
30529 1726882631.25035: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882631.25204: in run() - task 12673a56-9f93-b0f1-edc0-000000000f6d 30529 1726882631.25227: variable 'ansible_search_path' from source: unknown 30529 1726882631.25236: variable 'ansible_search_path' from source: unknown 30529 1726882631.25298: calling self._execute() 30529 1726882631.25599: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.25603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.25605: variable 'omit' from source: magic vars 30529 1726882631.25898: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.25924: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.25958: variable 'omit' from source: magic vars 30529 1726882631.26051: variable 'omit' from source: magic vars 30529 1726882631.26268: variable 'interface' from source: play vars 30529 1726882631.26272: variable 'omit' from source: magic vars 30529 1726882631.26321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882631.26398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.26406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882631.26421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.26432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.26454: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.26458: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.26460: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.26619: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.26622: Set connection var ansible_pipelining to False 30529 1726882631.26624: Set connection var ansible_shell_type to sh 30529 1726882631.26627: Set connection var ansible_timeout to 10 30529 1726882631.26629: Set connection var ansible_connection to ssh 30529 1726882631.26631: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.26653: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.26656: variable 'ansible_connection' from source: unknown 30529 1726882631.26659: variable 'ansible_module_compression' from source: unknown 30529 1726882631.26662: variable 'ansible_shell_type' from source: unknown 30529 1726882631.26667: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.26670: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.26672: variable 'ansible_pipelining' from source: unknown 30529 1726882631.26674: variable 'ansible_timeout' from source: unknown 30529 1726882631.26691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.26817: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.26821: variable 'omit' from source: magic vars 30529 1726882631.26840: starting attempt loop 30529 1726882631.26844: running the handler 30529 1726882631.26945: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.26953: _low_level_execute_command(): starting 30529 1726882631.26956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882631.27527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882631.27563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.27633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.29313: stdout chunk (state=3): >>>/root <<< 30529 1726882631.29408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.29441: stderr chunk (state=3): >>><<< 30529 1726882631.29444: stdout chunk (state=3): >>><<< 30529 1726882631.29457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882631.29468: _low_level_execute_command(): starting 30529 1726882631.29474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785 `" && echo ansible-tmp-1726882631.2945766-32706-158391587223785="` echo /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785 `" ) && sleep 0' 30529 1726882631.30099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.30112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882631.30118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.30151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882631.30165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.30231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.32334: stdout chunk (state=3): >>>ansible-tmp-1726882631.2945766-32706-158391587223785=/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785 <<< 30529 1726882631.32342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.32345: stderr chunk (state=3): >>><<< 30529 1726882631.32347: stdout chunk (state=3): >>><<< 30529 1726882631.32350: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882631.2945766-32706-158391587223785=/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882631.32357: variable 'ansible_module_compression' from source: unknown 30529 1726882631.32549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882631.32552: variable 'ansible_facts' from source: unknown 30529 1726882631.32608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py 30529 1726882631.32834: Sending initial data 30529 1726882631.32837: Sent initial data (156 bytes) 30529 1726882631.33780: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882631.33792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882631.33797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.33817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.35350: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882631.35366: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882631.35379: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882631.35396: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30529 1726882631.35442: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882631.35528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882631.35574: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpau52fbxd /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py <<< 30529 1726882631.35578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py" <<< 30529 1726882631.35647: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpau52fbxd" to remote "/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py" <<< 30529 1726882631.36496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.36547: stderr chunk (state=3): >>><<< 30529 1726882631.36550: stdout chunk (state=3): >>><<< 30529 1726882631.36634: done transferring module to remote 30529 1726882631.36637: _low_level_execute_command(): starting 30529 1726882631.36640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/ /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py && sleep 0' 30529 1726882631.37264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882631.37300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882631.37303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882631.37305: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.37307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882631.37313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.37410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.37473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.39234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.39245: stderr chunk (state=3): >>><<< 30529 1726882631.39269: stdout chunk (state=3): >>><<< 30529 1726882631.39357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882631.39369: _low_level_execute_command(): starting 30529 1726882631.39375: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/AnsiballZ_command.py && sleep 0' 30529 1726882631.39801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882631.39806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882631.39808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882631.39810: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882631.39812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.39861: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882631.39867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.39914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.59533: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (6645673c-872c-4c3e-a9a0-f259b2189616) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:37:11.548307", "end": "2024-09-20 21:37:11.593306", "delta": "0:00:00.044999", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882631.61917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.62054: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882631.62058: stdout chunk (state=3): >>><<< 30529 1726882631.62143: stderr chunk (state=3): >>><<< 30529 1726882631.62147: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (6645673c-872c-4c3e-a9a0-f259b2189616) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:37:11.548307", "end": "2024-09-20 21:37:11.593306", "delta": "0:00:00.044999", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882631.62301: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882631.62309: _low_level_execute_command(): starting 30529 1726882631.62315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882631.2945766-32706-158391587223785/ > /dev/null 2>&1 && sleep 0' 30529 1726882631.63694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882631.63730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882631.63745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882631.63840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882631.63997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882631.64021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882631.64046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882631.64156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882631.66004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882631.66054: stderr chunk (state=3): >>><<< 30529 1726882631.66057: stdout chunk (state=3): >>><<< 30529 1726882631.66104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882631.66107: handler run complete 30529 1726882631.66134: Evaluated conditional (False): False 30529 1726882631.66227: attempt loop complete, returning result 30529 1726882631.66231: _execute() done 30529 1726882631.66233: dumping result to json 30529 1726882631.66235: done dumping result, returning 30529 1726882631.66237: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-000000000f6d] 30529 1726882631.66239: sending task result for task 12673a56-9f93-b0f1-edc0-000000000f6d 30529 1726882631.66310: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000f6d 30529 1726882631.66313: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.044999", "end": "2024-09-20 21:37:11.593306", "rc": 0, "start": "2024-09-20 21:37:11.548307" } STDOUT: Connection 'statebr' (6645673c-872c-4c3e-a9a0-f259b2189616) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30529 1726882631.66379: no more pending results, returning what we have 30529 1726882631.66383: results queue empty 30529 1726882631.66384: checking for any_errors_fatal 30529 1726882631.66385: done checking for any_errors_fatal 30529 1726882631.66386: checking for max_fail_percentage 30529 1726882631.66388: done checking for max_fail_percentage 30529 1726882631.66389: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.66390: done checking to see if all hosts have failed 30529 1726882631.66390: getting the remaining hosts for this loop 30529 1726882631.66392: done getting the remaining hosts for this loop 30529 1726882631.66399: getting the next task for host managed_node1 30529 1726882631.66411: done getting next task for host managed_node1 30529 1726882631.66414: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882631.66417: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882631.66422: getting variables 30529 1726882631.66425: in VariableManager get_vars() 30529 1726882631.66460: Calling all_inventory to load vars for managed_node1 30529 1726882631.66462: Calling groups_inventory to load vars for managed_node1 30529 1726882631.66466: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.66478: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.66481: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.66484: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.69557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.71637: done with get_vars() 30529 1726882631.71661: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Friday 20 September 2024 21:37:11 -0400 (0:00:00.472) 0:00:45.743 ****** 30529 1726882631.71769: entering _queue_task() for managed_node1/include_tasks 30529 1726882631.72161: worker is 1 (out of 1 available) 30529 1726882631.72312: exiting _queue_task() for managed_node1/include_tasks 30529 1726882631.72325: done queuing things up, now waiting for results queue to drain 30529 1726882631.72327: waiting for pending results... 
30529 1726882631.72917: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' 30529 1726882631.72923: in run() - task 12673a56-9f93-b0f1-edc0-000000000013 30529 1726882631.73013: variable 'ansible_search_path' from source: unknown 30529 1726882631.73017: calling self._execute() 30529 1726882631.73501: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.73504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.73509: variable 'omit' from source: magic vars 30529 1726882631.74017: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.74025: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.74031: _execute() done 30529 1726882631.74035: dumping result to json 30529 1726882631.74038: done dumping result, returning 30529 1726882631.74044: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-000000000013] 30529 1726882631.74061: sending task result for task 12673a56-9f93-b0f1-edc0-000000000013 30529 1726882631.74161: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000013 30529 1726882631.74269: no more pending results, returning what we have 30529 1726882631.74281: in VariableManager get_vars() 30529 1726882631.74318: Calling all_inventory to load vars for managed_node1 30529 1726882631.74321: Calling groups_inventory to load vars for managed_node1 30529 1726882631.74324: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.74335: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.74337: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.74340: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.75397: WORKER PROCESS EXITING 30529 1726882631.75411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30529 1726882631.77071: done with get_vars() 30529 1726882631.77099: variable 'ansible_search_path' from source: unknown 30529 1726882631.77116: we have included files to process 30529 1726882631.77117: generating all_blocks data 30529 1726882631.77119: done generating all_blocks data 30529 1726882631.77123: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882631.77124: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882631.77127: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882631.77576: in VariableManager get_vars() 30529 1726882631.77599: done with get_vars() 30529 1726882631.77646: in VariableManager get_vars() 30529 1726882631.77662: done with get_vars() 30529 1726882631.77709: in VariableManager get_vars() 30529 1726882631.77732: done with get_vars() 30529 1726882631.77779: in VariableManager get_vars() 30529 1726882631.77806: done with get_vars() 30529 1726882631.77851: in VariableManager get_vars() 30529 1726882631.77870: done with get_vars() 30529 1726882631.78251: in VariableManager get_vars() 30529 1726882631.78271: done with get_vars() 30529 1726882631.78280: done processing included file 30529 1726882631.78281: iterating over new_blocks loaded from include file 30529 1726882631.78282: in VariableManager get_vars() 30529 1726882631.78289: done with get_vars() 30529 1726882631.78294: filtering new block on tags 30529 1726882631.78429: done filtering new block on tags 30529 1726882631.78432: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1 30529 1726882631.78437: extending task lists for all hosts with included 
blocks 30529 1726882631.78469: done extending task lists 30529 1726882631.78470: done processing included files 30529 1726882631.78471: results queue empty 30529 1726882631.78472: checking for any_errors_fatal 30529 1726882631.78475: done checking for any_errors_fatal 30529 1726882631.78476: checking for max_fail_percentage 30529 1726882631.78477: done checking for max_fail_percentage 30529 1726882631.78478: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.78478: done checking to see if all hosts have failed 30529 1726882631.78479: getting the remaining hosts for this loop 30529 1726882631.78480: done getting the remaining hosts for this loop 30529 1726882631.78483: getting the next task for host managed_node1 30529 1726882631.78487: done getting next task for host managed_node1 30529 1726882631.78492: ^ task is: TASK: TEST: {{ lsr_description }} 30529 1726882631.78496: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882631.78498: getting variables 30529 1726882631.78499: in VariableManager get_vars() 30529 1726882631.78507: Calling all_inventory to load vars for managed_node1 30529 1726882631.78509: Calling groups_inventory to load vars for managed_node1 30529 1726882631.78511: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.78515: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.78517: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.78519: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.79734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.81358: done with get_vars() 30529 1726882631.81381: done getting variables 30529 1726882631.81435: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882631.81566: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:37:11 -0400 (0:00:00.098) 0:00:45.842 ****** 30529 1726882631.81607: entering _queue_task() for managed_node1/debug 30529 1726882631.81969: worker is 1 (out of 1 available) 30529 1726882631.81980: exiting _queue_task() for managed_node1/debug 30529 1726882631.81997: done queuing things up, now waiting for results queue to drain 30529 1726882631.82001: waiting for pending results... 
30529 1726882631.82286: running TaskExecutor() for managed_node1/TASK: TEST: I can remove an existing profile without taking it down 30529 1726882631.82408: in run() - task 12673a56-9f93-b0f1-edc0-000000001005 30529 1726882631.82431: variable 'ansible_search_path' from source: unknown 30529 1726882631.82434: variable 'ansible_search_path' from source: unknown 30529 1726882631.82437: calling self._execute() 30529 1726882631.82567: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.82576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.82580: variable 'omit' from source: magic vars 30529 1726882631.82927: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.82935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.82951: variable 'omit' from source: magic vars 30529 1726882631.82977: variable 'omit' from source: magic vars 30529 1726882631.83110: variable 'lsr_description' from source: include params 30529 1726882631.83148: variable 'omit' from source: magic vars 30529 1726882631.83169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882631.83197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.83223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882631.83245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.83257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.83308: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.83313: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882631.83316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.83560: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.83567: Set connection var ansible_pipelining to False 30529 1726882631.83570: Set connection var ansible_shell_type to sh 30529 1726882631.83572: Set connection var ansible_timeout to 10 30529 1726882631.83574: Set connection var ansible_connection to ssh 30529 1726882631.83576: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.83578: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.83581: variable 'ansible_connection' from source: unknown 30529 1726882631.83584: variable 'ansible_module_compression' from source: unknown 30529 1726882631.83585: variable 'ansible_shell_type' from source: unknown 30529 1726882631.83587: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.83589: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.83591: variable 'ansible_pipelining' from source: unknown 30529 1726882631.83691: variable 'ansible_timeout' from source: unknown 30529 1726882631.83715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.83904: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.83908: variable 'omit' from source: magic vars 30529 1726882631.83911: starting attempt loop 30529 1726882631.83914: running the handler 30529 1726882631.84122: handler run complete 30529 1726882631.84125: attempt loop complete, returning result 30529 1726882631.84127: _execute() done 30529 1726882631.84128: dumping result to json 30529 1726882631.84133: done dumping result, returning 
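The repeated `Set connection var ...` records show the per-task connection settings the executor resolved: most come back `from source: unknown`, meaning they fell through to ansible-core defaults rather than inventory or play vars. A hedged sketch of host vars that would pin the same values explicitly (names are real Ansible connection variables; the values mirror what this log reports):

```yaml
# Equivalent explicit host_vars for managed_node1; in this run these
# values are defaults, not set anywhere (source: unknown in the log).
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
```

Only `ansible_host` and `ansible_ssh_extra_args` resolve from actual host vars here (their values are not shown in this log and are left elided).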
30529 1726882631.84135: done running TaskExecutor() for managed_node1/TASK: TEST: I can remove an existing profile without taking it down [12673a56-9f93-b0f1-edc0-000000001005] 30529 1726882631.84137: sending task result for task 12673a56-9f93-b0f1-edc0-000000001005 30529 1726882631.84208: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001005 30529 1726882631.84210: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ########## I can remove an existing profile without taking it down ########## 30529 1726882631.84276: no more pending results, returning what we have 30529 1726882631.84279: results queue empty 30529 1726882631.84280: checking for any_errors_fatal 30529 1726882631.84282: done checking for any_errors_fatal 30529 1726882631.84283: checking for max_fail_percentage 30529 1726882631.84284: done checking for max_fail_percentage 30529 1726882631.84285: checking to see if all hosts have failed and the running result is not ok 30529 1726882631.84286: done checking to see if all hosts have failed 30529 1726882631.84286: getting the remaining hosts for this loop 30529 1726882631.84287: done getting the remaining hosts for this loop 30529 1726882631.84295: getting the next task for host managed_node1 30529 1726882631.84301: done getting next task for host managed_node1 30529 1726882631.84303: ^ task is: TASK: Show item 30529 1726882631.84305: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882631.84312: getting variables 30529 1726882631.84313: in VariableManager get_vars() 30529 1726882631.84350: Calling all_inventory to load vars for managed_node1 30529 1726882631.84356: Calling groups_inventory to load vars for managed_node1 30529 1726882631.84360: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882631.84369: Calling all_plugins_play to load vars for managed_node1 30529 1726882631.84371: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882631.84378: Calling groups_plugins_play to load vars for managed_node1 30529 1726882631.85460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882631.86728: done with get_vars() 30529 1726882631.86754: done getting variables 30529 1726882631.86820: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:37:11 -0400 (0:00:00.052) 0:00:45.894 ****** 30529 1726882631.86853: entering _queue_task() for managed_node1/debug 30529 1726882631.87176: worker is 1 (out of 1 available) 30529 1726882631.87190: exiting _queue_task() for managed_node1/debug 30529 1726882631.87206: done queuing things up, now waiting for results queue to drain 30529 1726882631.87207: waiting for pending results... 
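The `TASK [Show item]` at `run_test.yml:9`, whose per-item results follow below, is a looped `debug` task: the log resolves `variable 'item'` and then each `lsr_*` variable in turn. A plausible reconstruction, assuming the loop items are exactly the variable names printed in the `ok: ... (item=...)` results later in this log:

```yaml
# Sketch of run_test.yml:9 inferred from the loop output; not verbatim source.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
```

Note that `lsr_assert_when` prints `VARIABLE IS NOT DEFINED!` rather than failing: `debug` with `var:` reports undefined variables instead of raising an error, which is why the item still returns `ok`.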
30529 1726882631.87597: running TaskExecutor() for managed_node1/TASK: Show item 30529 1726882631.87672: in run() - task 12673a56-9f93-b0f1-edc0-000000001006 30529 1726882631.87691: variable 'ansible_search_path' from source: unknown 30529 1726882631.87696: variable 'ansible_search_path' from source: unknown 30529 1726882631.87898: variable 'omit' from source: magic vars 30529 1726882631.88502: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.88511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.88515: variable 'omit' from source: magic vars 30529 1726882631.88962: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.88974: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.88981: variable 'omit' from source: magic vars 30529 1726882631.89059: variable 'omit' from source: magic vars 30529 1726882631.89111: variable 'item' from source: unknown 30529 1726882631.89178: variable 'item' from source: unknown 30529 1726882631.89197: variable 'omit' from source: magic vars 30529 1726882631.89326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882631.89429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.89433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882631.89453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.89467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.89491: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.89499: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882631.89502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.89581: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.89585: Set connection var ansible_pipelining to False 30529 1726882631.89587: Set connection var ansible_shell_type to sh 30529 1726882631.89600: Set connection var ansible_timeout to 10 30529 1726882631.89603: Set connection var ansible_connection to ssh 30529 1726882631.89607: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.89625: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.89628: variable 'ansible_connection' from source: unknown 30529 1726882631.89631: variable 'ansible_module_compression' from source: unknown 30529 1726882631.89633: variable 'ansible_shell_type' from source: unknown 30529 1726882631.89635: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.89637: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.89644: variable 'ansible_pipelining' from source: unknown 30529 1726882631.89647: variable 'ansible_timeout' from source: unknown 30529 1726882631.89649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.89762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.89772: variable 'omit' from source: magic vars 30529 1726882631.89776: starting attempt loop 30529 1726882631.89784: running the handler 30529 1726882631.89819: variable 'lsr_description' from source: include params 30529 1726882631.89880: variable 'lsr_description' from source: include params 30529 1726882631.89892: handler run complete 30529 1726882631.89906: attempt loop 
complete, returning result 30529 1726882631.89917: variable 'item' from source: unknown 30529 1726882631.89963: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 30529 1726882631.90117: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.90120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.90122: variable 'omit' from source: magic vars 30529 1726882631.90184: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.90188: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.90195: variable 'omit' from source: magic vars 30529 1726882631.90205: variable 'omit' from source: magic vars 30529 1726882631.90240: variable 'item' from source: unknown 30529 1726882631.90277: variable 'item' from source: unknown 30529 1726882631.90288: variable 'omit' from source: magic vars 30529 1726882631.90305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.90312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.90318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.90327: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.90330: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.90332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.90402: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.90405: Set connection var 
ansible_pipelining to False 30529 1726882631.90411: Set connection var ansible_shell_type to sh 30529 1726882631.90442: Set connection var ansible_timeout to 10 30529 1726882631.90445: Set connection var ansible_connection to ssh 30529 1726882631.90447: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.90484: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.90487: variable 'ansible_connection' from source: unknown 30529 1726882631.90492: variable 'ansible_module_compression' from source: unknown 30529 1726882631.90496: variable 'ansible_shell_type' from source: unknown 30529 1726882631.90498: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.90500: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.90503: variable 'ansible_pipelining' from source: unknown 30529 1726882631.90505: variable 'ansible_timeout' from source: unknown 30529 1726882631.90507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.90577: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.90585: variable 'omit' from source: magic vars 30529 1726882631.90588: starting attempt loop 30529 1726882631.90595: running the handler 30529 1726882631.90597: variable 'lsr_setup' from source: include params 30529 1726882631.90701: variable 'lsr_setup' from source: include params 30529 1726882631.90715: handler run complete 30529 1726882631.90724: attempt loop complete, returning result 30529 1726882631.90736: variable 'item' from source: unknown 30529 1726882631.90788: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 30529 1726882631.90868: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.90871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.90873: variable 'omit' from source: magic vars 30529 1726882631.91083: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.91085: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.91087: variable 'omit' from source: magic vars 30529 1726882631.91092: variable 'omit' from source: magic vars 30529 1726882631.91298: variable 'item' from source: unknown 30529 1726882631.91302: variable 'item' from source: unknown 30529 1726882631.91304: variable 'omit' from source: magic vars 30529 1726882631.91306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.91308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.91310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.91311: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.91313: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.91315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.91353: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.91364: Set connection var ansible_pipelining to False 30529 1726882631.91371: Set connection var ansible_shell_type to sh 30529 1726882631.91385: Set connection var ansible_timeout to 10 30529 1726882631.91397: Set connection var ansible_connection to ssh 30529 
1726882631.91407: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.91441: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.91471: variable 'ansible_connection' from source: unknown 30529 1726882631.91549: variable 'ansible_module_compression' from source: unknown 30529 1726882631.91553: variable 'ansible_shell_type' from source: unknown 30529 1726882631.91555: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.91557: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.91559: variable 'ansible_pipelining' from source: unknown 30529 1726882631.91561: variable 'ansible_timeout' from source: unknown 30529 1726882631.91563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.91632: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.91659: variable 'omit' from source: magic vars 30529 1726882631.91670: starting attempt loop 30529 1726882631.91678: running the handler 30529 1726882631.91707: variable 'lsr_test' from source: include params 30529 1726882631.91795: variable 'lsr_test' from source: include params 30529 1726882631.91878: handler run complete 30529 1726882631.91881: attempt loop complete, returning result 30529 1726882631.91884: variable 'item' from source: unknown 30529 1726882631.92101: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 30529 1726882631.92413: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.92417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 
30529 1726882631.92428: variable 'omit' from source: magic vars 30529 1726882631.92681: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.92709: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.92716: variable 'omit' from source: magic vars 30529 1726882631.92730: variable 'omit' from source: magic vars 30529 1726882631.92774: variable 'item' from source: unknown 30529 1726882631.92839: variable 'item' from source: unknown 30529 1726882631.92899: variable 'omit' from source: magic vars 30529 1726882631.92981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.93057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.93158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.93162: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.93164: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.93166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.93382: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.93423: Set connection var ansible_pipelining to False 30529 1726882631.93427: Set connection var ansible_shell_type to sh 30529 1726882631.93499: Set connection var ansible_timeout to 10 30529 1726882631.93503: Set connection var ansible_connection to ssh 30529 1726882631.93505: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.93507: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.93509: variable 'ansible_connection' from source: unknown 30529 1726882631.93528: variable 'ansible_module_compression' 
from source: unknown 30529 1726882631.93530: variable 'ansible_shell_type' from source: unknown 30529 1726882631.93532: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.93534: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.93648: variable 'ansible_pipelining' from source: unknown 30529 1726882631.93656: variable 'ansible_timeout' from source: unknown 30529 1726882631.93658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.93829: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.93832: variable 'omit' from source: magic vars 30529 1726882631.93834: starting attempt loop 30529 1726882631.93918: running the handler 30529 1726882631.93952: variable 'lsr_assert' from source: include params 30529 1726882631.94207: variable 'lsr_assert' from source: include params 30529 1726882631.94211: handler run complete 30529 1726882631.94213: attempt loop complete, returning result 30529 1726882631.94215: variable 'item' from source: unknown 30529 1726882631.94217: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 30529 1726882631.94621: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.94625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.94627: variable 'omit' from source: magic vars 30529 1726882631.95099: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.95109: Evaluated conditional (ansible_distribution_major_version != '6'): 
True 30529 1726882631.95112: variable 'omit' from source: magic vars 30529 1726882631.95114: variable 'omit' from source: magic vars 30529 1726882631.95116: variable 'item' from source: unknown 30529 1726882631.95118: variable 'item' from source: unknown 30529 1726882631.95348: variable 'omit' from source: magic vars 30529 1726882631.95352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.95355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.95357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.95359: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.95361: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.95363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.95430: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.95522: Set connection var ansible_pipelining to False 30529 1726882631.95529: Set connection var ansible_shell_type to sh 30529 1726882631.95579: Set connection var ansible_timeout to 10 30529 1726882631.95587: Set connection var ansible_connection to ssh 30529 1726882631.95599: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.95785: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.95789: variable 'ansible_connection' from source: unknown 30529 1726882631.95791: variable 'ansible_module_compression' from source: unknown 30529 1726882631.95794: variable 'ansible_shell_type' from source: unknown 30529 1726882631.95797: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.95799: variable 'ansible_host' from source: host 
vars for 'managed_node1' 30529 1726882631.95801: variable 'ansible_pipelining' from source: unknown 30529 1726882631.95803: variable 'ansible_timeout' from source: unknown 30529 1726882631.95805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.96129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.96133: variable 'omit' from source: magic vars 30529 1726882631.96136: starting attempt loop 30529 1726882631.96138: running the handler 30529 1726882631.96383: handler run complete 30529 1726882631.96582: attempt loop complete, returning result 30529 1726882631.96782: variable 'item' from source: unknown 30529 1726882631.97039: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30529 1726882631.97401: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.97406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.97522: variable 'omit' from source: magic vars 30529 1726882631.97649: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.97652: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.97655: variable 'omit' from source: magic vars 30529 1726882631.97657: variable 'omit' from source: magic vars 30529 1726882631.97659: variable 'item' from source: unknown 30529 1726882631.97873: variable 'item' from source: unknown 30529 1726882631.97959: variable 'omit' from source: magic vars 30529 1726882631.98021: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882631.98028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.98034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882631.98044: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882631.98047: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.98049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.98523: Set connection var ansible_shell_executable to /bin/sh 30529 1726882631.98527: Set connection var ansible_pipelining to False 30529 1726882631.98529: Set connection var ansible_shell_type to sh 30529 1726882631.98540: Set connection var ansible_timeout to 10 30529 1726882631.98543: Set connection var ansible_connection to ssh 30529 1726882631.98547: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882631.98566: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.98569: variable 'ansible_connection' from source: unknown 30529 1726882631.98572: variable 'ansible_module_compression' from source: unknown 30529 1726882631.98574: variable 'ansible_shell_type' from source: unknown 30529 1726882631.98576: variable 'ansible_shell_executable' from source: unknown 30529 1726882631.98578: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.98622: variable 'ansible_pipelining' from source: unknown 30529 1726882631.98625: variable 'ansible_timeout' from source: unknown 30529 1726882631.98627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.98675: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882631.98682: variable 'omit' from source: magic vars 30529 1726882631.98728: starting attempt loop 30529 1726882631.98731: running the handler 30529 1726882631.98944: variable 'lsr_fail_debug' from source: play vars 30529 1726882631.98996: variable 'lsr_fail_debug' from source: play vars 30529 1726882631.99010: handler run complete 30529 1726882631.99033: attempt loop complete, returning result 30529 1726882631.99298: variable 'item' from source: unknown 30529 1726882631.99346: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30529 1726882631.99565: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882631.99568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882631.99570: variable 'omit' from source: magic vars 30529 1726882631.99804: variable 'ansible_distribution_major_version' from source: facts 30529 1726882631.99808: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882631.99812: variable 'omit' from source: magic vars 30529 1726882631.99814: variable 'omit' from source: magic vars 30529 1726882632.00219: variable 'item' from source: unknown 30529 1726882632.00381: variable 'item' from source: unknown 30529 1726882632.00384: variable 'omit' from source: magic vars 30529 1726882632.00387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882632.00392: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.00400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.00403: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882632.00405: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.00407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.00409: Set connection var ansible_shell_executable to /bin/sh 30529 1726882632.00413: Set connection var ansible_pipelining to False 30529 1726882632.00418: Set connection var ansible_shell_type to sh 30529 1726882632.00609: Set connection var ansible_timeout to 10 30529 1726882632.00612: Set connection var ansible_connection to ssh 30529 1726882632.00615: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882632.00636: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.00638: variable 'ansible_connection' from source: unknown 30529 1726882632.00641: variable 'ansible_module_compression' from source: unknown 30529 1726882632.00643: variable 'ansible_shell_type' from source: unknown 30529 1726882632.00645: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.00647: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.00650: variable 'ansible_pipelining' from source: unknown 30529 1726882632.00652: variable 'ansible_timeout' from source: unknown 30529 1726882632.00710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.00736: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882632.00743: variable 'omit' from source: magic vars 30529 1726882632.00746: starting attempt loop 30529 1726882632.00748: running the handler 30529 1726882632.00766: variable 'lsr_cleanup' from source: include params 30529 1726882632.00928: variable 'lsr_cleanup' from source: include params 30529 1726882632.00944: handler run complete 30529 1726882632.00957: attempt loop complete, returning result 30529 1726882632.00969: variable 'item' from source: unknown 30529 1726882632.01033: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30529 1726882632.01182: dumping result to json 30529 1726882632.01184: done dumping result, returning 30529 1726882632.01186: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-000000001006] 30529 1726882632.01188: sending task result for task 12673a56-9f93-b0f1-edc0-000000001006 30529 1726882632.01439: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001006 30529 1726882632.01442: WORKER PROCESS EXITING 30529 1726882632.01557: no more pending results, returning what we have 30529 1726882632.01560: results queue empty 30529 1726882632.01561: checking for any_errors_fatal 30529 1726882632.01567: done checking for any_errors_fatal 30529 1726882632.01567: checking for max_fail_percentage 30529 1726882632.01569: done checking for max_fail_percentage 30529 1726882632.01570: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.01570: done checking to see if all hosts have failed 30529 1726882632.01571: getting the remaining hosts for this loop 30529 1726882632.01574: done getting the remaining hosts for this loop 30529 
1726882632.01577: getting the next task for host managed_node1 30529 1726882632.01584: done getting next task for host managed_node1 30529 1726882632.01586: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882632.01588: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882632.01596: getting variables 30529 1726882632.01598: in VariableManager get_vars() 30529 1726882632.01628: Calling all_inventory to load vars for managed_node1 30529 1726882632.01630: Calling groups_inventory to load vars for managed_node1 30529 1726882632.01634: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.01643: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.01645: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.01648: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.04738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.06382: done with get_vars() 30529 1726882632.06413: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:37:12 -0400 (0:00:00.196) 0:00:46.091 ****** 30529 1726882632.06522: entering _queue_task() for managed_node1/include_tasks 30529 
1726882632.07113: worker is 1 (out of 1 available) 30529 1726882632.07123: exiting _queue_task() for managed_node1/include_tasks 30529 1726882632.07134: done queuing things up, now waiting for results queue to drain 30529 1726882632.07136: waiting for pending results... 30529 1726882632.07268: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882632.07474: in run() - task 12673a56-9f93-b0f1-edc0-000000001007 30529 1726882632.07479: variable 'ansible_search_path' from source: unknown 30529 1726882632.07484: variable 'ansible_search_path' from source: unknown 30529 1726882632.07487: calling self._execute() 30529 1726882632.07553: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.07565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.07587: variable 'omit' from source: magic vars 30529 1726882632.07981: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.08006: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.08026: _execute() done 30529 1726882632.08034: dumping result to json 30529 1726882632.08042: done dumping result, returning 30529 1726882632.08052: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-000000001007] 30529 1726882632.08062: sending task result for task 12673a56-9f93-b0f1-edc0-000000001007 30529 1726882632.08278: no more pending results, returning what we have 30529 1726882632.08283: in VariableManager get_vars() 30529 1726882632.08331: Calling all_inventory to load vars for managed_node1 30529 1726882632.08334: Calling groups_inventory to load vars for managed_node1 30529 1726882632.08498: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.08511: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.08514: Calling groups_plugins_inventory to load 
vars for managed_node1 30529 1726882632.08517: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.09109: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001007 30529 1726882632.09113: WORKER PROCESS EXITING 30529 1726882632.10007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.11608: done with get_vars() 30529 1726882632.11638: variable 'ansible_search_path' from source: unknown 30529 1726882632.11640: variable 'ansible_search_path' from source: unknown 30529 1726882632.11681: we have included files to process 30529 1726882632.11682: generating all_blocks data 30529 1726882632.11685: done generating all_blocks data 30529 1726882632.11694: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882632.11696: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882632.11698: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882632.11813: in VariableManager get_vars() 30529 1726882632.11833: done with get_vars() 30529 1726882632.11959: done processing included file 30529 1726882632.11962: iterating over new_blocks loaded from include file 30529 1726882632.11963: in VariableManager get_vars() 30529 1726882632.11978: done with get_vars() 30529 1726882632.11980: filtering new block on tags 30529 1726882632.12020: done filtering new block on tags 30529 1726882632.12023: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 1726882632.12028: extending task lists for all hosts with included blocks 30529 1726882632.12519: 
done extending task lists 30529 1726882632.12521: done processing included files 30529 1726882632.12521: results queue empty 30529 1726882632.12522: checking for any_errors_fatal 30529 1726882632.12529: done checking for any_errors_fatal 30529 1726882632.12530: checking for max_fail_percentage 30529 1726882632.12531: done checking for max_fail_percentage 30529 1726882632.12532: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.12533: done checking to see if all hosts have failed 30529 1726882632.12533: getting the remaining hosts for this loop 30529 1726882632.12535: done getting the remaining hosts for this loop 30529 1726882632.12538: getting the next task for host managed_node1 30529 1726882632.12542: done getting next task for host managed_node1 30529 1726882632.12544: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882632.12547: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882632.12549: getting variables 30529 1726882632.12550: in VariableManager get_vars() 30529 1726882632.12561: Calling all_inventory to load vars for managed_node1 30529 1726882632.12563: Calling groups_inventory to load vars for managed_node1 30529 1726882632.12565: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.12570: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.12573: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.12575: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.15205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.17968: done with get_vars() 30529 1726882632.18000: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:37:12 -0400 (0:00:00.115) 0:00:46.206 ****** 30529 1726882632.18087: entering _queue_task() for managed_node1/include_tasks 30529 1726882632.18531: worker is 1 (out of 1 available) 30529 1726882632.18543: exiting _queue_task() for managed_node1/include_tasks 30529 1726882632.18667: done queuing things up, now waiting for results queue to drain 30529 1726882632.18668: waiting for pending results... 
30529 1726882632.18828: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882632.18948: in run() - task 12673a56-9f93-b0f1-edc0-00000000102e 30529 1726882632.18969: variable 'ansible_search_path' from source: unknown 30529 1726882632.18978: variable 'ansible_search_path' from source: unknown 30529 1726882632.19029: calling self._execute() 30529 1726882632.19135: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.19147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.19162: variable 'omit' from source: magic vars 30529 1726882632.19555: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.19575: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.19589: _execute() done 30529 1726882632.19604: dumping result to json 30529 1726882632.19613: done dumping result, returning 30529 1726882632.19625: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-00000000102e] 30529 1726882632.19636: sending task result for task 12673a56-9f93-b0f1-edc0-00000000102e 30529 1726882632.19823: no more pending results, returning what we have 30529 1726882632.19829: in VariableManager get_vars() 30529 1726882632.19868: Calling all_inventory to load vars for managed_node1 30529 1726882632.19871: Calling groups_inventory to load vars for managed_node1 30529 1726882632.19875: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.19896: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.19901: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.20099: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.20942: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000102e 30529 1726882632.20945: WORKER PROCESS EXITING 30529 
1726882632.22476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.24087: done with get_vars() 30529 1726882632.24110: variable 'ansible_search_path' from source: unknown 30529 1726882632.24112: variable 'ansible_search_path' from source: unknown 30529 1726882632.24146: we have included files to process 30529 1726882632.24147: generating all_blocks data 30529 1726882632.24149: done generating all_blocks data 30529 1726882632.24150: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882632.24151: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882632.24153: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882632.24532: done processing included file 30529 1726882632.24534: iterating over new_blocks loaded from include file 30529 1726882632.24536: in VariableManager get_vars() 30529 1726882632.24552: done with get_vars() 30529 1726882632.24553: filtering new block on tags 30529 1726882632.24601: done filtering new block on tags 30529 1726882632.24603: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882632.24608: extending task lists for all hosts with included blocks 30529 1726882632.24768: done extending task lists 30529 1726882632.24770: done processing included files 30529 1726882632.24771: results queue empty 30529 1726882632.24771: checking for any_errors_fatal 30529 1726882632.24774: done checking for any_errors_fatal 30529 1726882632.24775: checking for max_fail_percentage 30529 1726882632.24776: done 
checking for max_fail_percentage 30529 1726882632.24777: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.24778: done checking to see if all hosts have failed 30529 1726882632.24778: getting the remaining hosts for this loop 30529 1726882632.24780: done getting the remaining hosts for this loop 30529 1726882632.24782: getting the next task for host managed_node1 30529 1726882632.24787: done getting next task for host managed_node1 30529 1726882632.24788: ^ task is: TASK: Gather current interface info 30529 1726882632.24792: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882632.24795: getting variables 30529 1726882632.24797: in VariableManager get_vars() 30529 1726882632.24813: Calling all_inventory to load vars for managed_node1 30529 1726882632.24816: Calling groups_inventory to load vars for managed_node1 30529 1726882632.24819: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.24824: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.24826: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.24829: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.26832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.28701: done with get_vars() 30529 1726882632.28736: done getting variables 30529 1726882632.28781: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:37:12 -0400 (0:00:00.107) 0:00:46.314 ****** 30529 1726882632.28822: entering _queue_task() for managed_node1/command 30529 1726882632.29556: worker is 1 (out of 1 available) 30529 1726882632.29568: exiting _queue_task() for managed_node1/command 30529 1726882632.29580: done queuing things up, now waiting for results queue to drain 30529 1726882632.29582: waiting for pending results... 
30529 1726882632.30094: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882632.30227: in run() - task 12673a56-9f93-b0f1-edc0-000000001069 30529 1726882632.30245: variable 'ansible_search_path' from source: unknown 30529 1726882632.30249: variable 'ansible_search_path' from source: unknown 30529 1726882632.30297: calling self._execute() 30529 1726882632.30397: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.30402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.30414: variable 'omit' from source: magic vars 30529 1726882632.30999: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.31004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.31007: variable 'omit' from source: magic vars 30529 1726882632.31011: variable 'omit' from source: magic vars 30529 1726882632.31014: variable 'omit' from source: magic vars 30529 1726882632.31017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882632.31021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882632.31050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882632.31067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.31078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.31111: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882632.31114: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.31116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882632.31220: Set connection var ansible_shell_executable to /bin/sh 30529 1726882632.31223: Set connection var ansible_pipelining to False 30529 1726882632.31226: Set connection var ansible_shell_type to sh 30529 1726882632.31236: Set connection var ansible_timeout to 10 30529 1726882632.31239: Set connection var ansible_connection to ssh 30529 1726882632.31253: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882632.31275: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.31279: variable 'ansible_connection' from source: unknown 30529 1726882632.31281: variable 'ansible_module_compression' from source: unknown 30529 1726882632.31284: variable 'ansible_shell_type' from source: unknown 30529 1726882632.31286: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.31288: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.31299: variable 'ansible_pipelining' from source: unknown 30529 1726882632.31302: variable 'ansible_timeout' from source: unknown 30529 1726882632.31305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.31554: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882632.31564: variable 'omit' from source: magic vars 30529 1726882632.31696: starting attempt loop 30529 1726882632.31699: running the handler 30529 1726882632.31705: _low_level_execute_command(): starting 30529 1726882632.31721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882632.32672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882632.32730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882632.32742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882632.32752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.32830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.34644: stdout chunk (state=3): >>>/root <<< 30529 1726882632.34686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882632.34744: stderr chunk (state=3): >>><<< 30529 1726882632.34761: stdout chunk (state=3): >>><<< 30529 1726882632.34796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882632.34824: _low_level_execute_command(): starting 30529 1726882632.34838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469 `" && echo ansible-tmp-1726882632.348097-32761-127171258432469="` echo /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469 `" ) && sleep 0' 30529 1726882632.35436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882632.35451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882632.35468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882632.35485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882632.35580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882632.35610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.35748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.37641: stdout chunk (state=3): >>>ansible-tmp-1726882632.348097-32761-127171258432469=/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469 <<< 30529 1726882632.37846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882632.37874: stderr chunk (state=3): >>><<< 30529 1726882632.37910: stdout chunk (state=3): >>><<< 30529 1726882632.37940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882632.348097-32761-127171258432469=/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882632.37991: variable 'ansible_module_compression' from source: unknown 30529 1726882632.38058: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882632.38113: variable 'ansible_facts' from source: unknown 30529 1726882632.38222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py 30529 1726882632.38628: Sending initial data 30529 1726882632.38634: Sent initial data (155 bytes) 30529 1726882632.39708: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882632.39844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882632.39872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.39970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.41470: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882632.41506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882632.41546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpk_qj52sp /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py <<< 30529 1726882632.41556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py" <<< 30529 1726882632.41587: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpk_qj52sp" to remote "/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py" <<< 30529 1726882632.41596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py" <<< 30529 1726882632.42259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882632.42273: stderr chunk (state=3): >>><<< 30529 1726882632.42278: stdout chunk (state=3): >>><<< 30529 1726882632.42318: done transferring module to remote 30529 1726882632.42327: _low_level_execute_command(): starting 30529 1726882632.42332: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/ /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py && sleep 0' 30529 1726882632.42771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882632.42775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882632.42778: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882632.42780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882632.42782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882632.42846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882632.42848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.42883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.44584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882632.44615: stderr chunk (state=3): >>><<< 30529 1726882632.44621: stdout chunk (state=3): >>><<< 30529 1726882632.44741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882632.44756: _low_level_execute_command(): starting 30529 1726882632.44761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/AnsiballZ_command.py && sleep 0' 30529 1726882632.45321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882632.45335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882632.45340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882632.45394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882632.45451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882632.45456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.45534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.60681: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:12.602851", "end": "2024-09-20 21:37:12.605848", "delta": "0:00:00.002997", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882632.62100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882632.62127: stderr chunk (state=3): >>><<< 30529 1726882632.62130: stdout chunk (state=3): >>><<< 30529 1726882632.62241: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:12.602851", "end": "2024-09-20 21:37:12.605848", "delta": "0:00:00.002997", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882632.62245: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882632.62247: _low_level_execute_command(): starting 30529 1726882632.62249: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882632.348097-32761-127171258432469/ > /dev/null 2>&1 && sleep 0' 30529 1726882632.62667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882632.62670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882632.62672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882632.62674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882632.62676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882632.62734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882632.62739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882632.62777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882632.64606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882632.64609: stdout chunk (state=3): >>><<< 30529 1726882632.64629: stderr chunk (state=3): >>><<< 30529 1726882632.64633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882632.64642: handler run complete 30529 1726882632.64667: Evaluated conditional (False): False 30529 1726882632.64680: attempt loop complete, returning result 30529 1726882632.64687: _execute() done 30529 1726882632.64734: dumping result to json 30529 1726882632.64741: done dumping result, returning 30529 1726882632.64743: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-000000001069] 30529 1726882632.64746: sending task result for task 12673a56-9f93-b0f1-edc0-000000001069 30529 1726882632.64877: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001069 30529 1726882632.64880: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002997", "end": "2024-09-20 21:37:12.605848", "rc": 0, "start": "2024-09-20 21:37:12.602851" } STDOUT: bonding_masters eth0 lo 30529 1726882632.64960: no more pending results, returning what we have 30529 1726882632.64964: results queue empty 30529 1726882632.64965: checking for 
any_errors_fatal 30529 1726882632.64967: done checking for any_errors_fatal 30529 1726882632.64967: checking for max_fail_percentage 30529 1726882632.64969: done checking for max_fail_percentage 30529 1726882632.64970: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.64971: done checking to see if all hosts have failed 30529 1726882632.64971: getting the remaining hosts for this loop 30529 1726882632.64973: done getting the remaining hosts for this loop 30529 1726882632.64977: getting the next task for host managed_node1 30529 1726882632.64987: done getting next task for host managed_node1 30529 1726882632.64989: ^ task is: TASK: Set current_interfaces 30529 1726882632.64996: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882632.65004: getting variables 30529 1726882632.65006: in VariableManager get_vars() 30529 1726882632.65041: Calling all_inventory to load vars for managed_node1 30529 1726882632.65043: Calling groups_inventory to load vars for managed_node1 30529 1726882632.65047: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.65057: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.65060: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.65063: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.66527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.67730: done with get_vars() 30529 1726882632.67747: done getting variables 30529 1726882632.67795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:37:12 -0400 (0:00:00.389) 0:00:46.704 ****** 30529 1726882632.67818: entering _queue_task() for managed_node1/set_fact 30529 1726882632.68051: worker is 1 (out of 1 available) 30529 1726882632.68062: exiting _queue_task() for managed_node1/set_fact 30529 1726882632.68076: done queuing things up, now waiting for results queue to drain 30529 1726882632.68077: waiting for pending results... 
30529 1726882632.68259: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882632.68341: in run() - task 12673a56-9f93-b0f1-edc0-00000000106a 30529 1726882632.68353: variable 'ansible_search_path' from source: unknown 30529 1726882632.68358: variable 'ansible_search_path' from source: unknown 30529 1726882632.68384: calling self._execute() 30529 1726882632.68460: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.68464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.68473: variable 'omit' from source: magic vars 30529 1726882632.68741: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.68755: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.68758: variable 'omit' from source: magic vars 30529 1726882632.68795: variable 'omit' from source: magic vars 30529 1726882632.69098: variable '_current_interfaces' from source: set_fact 30529 1726882632.69102: variable 'omit' from source: magic vars 30529 1726882632.69105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882632.69107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882632.69110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882632.69112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.69134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.69169: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882632.69179: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.69449: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.69452: Set connection var ansible_shell_executable to /bin/sh 30529 1726882632.69454: Set connection var ansible_pipelining to False 30529 1726882632.69456: Set connection var ansible_shell_type to sh 30529 1726882632.69521: Set connection var ansible_timeout to 10 30529 1726882632.69524: Set connection var ansible_connection to ssh 30529 1726882632.69529: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882632.69557: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.69561: variable 'ansible_connection' from source: unknown 30529 1726882632.69563: variable 'ansible_module_compression' from source: unknown 30529 1726882632.69566: variable 'ansible_shell_type' from source: unknown 30529 1726882632.69568: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.69570: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.69572: variable 'ansible_pipelining' from source: unknown 30529 1726882632.69574: variable 'ansible_timeout' from source: unknown 30529 1726882632.69576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.69703: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882632.69713: variable 'omit' from source: magic vars 30529 1726882632.69719: starting attempt loop 30529 1726882632.69722: running the handler 30529 1726882632.69733: handler run complete 30529 1726882632.69743: attempt loop complete, returning result 30529 1726882632.69746: _execute() done 30529 1726882632.69750: dumping result to json 30529 1726882632.69752: done dumping result, returning 30529 
1726882632.69758: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-00000000106a] 30529 1726882632.69760: sending task result for task 12673a56-9f93-b0f1-edc0-00000000106a 30529 1726882632.69867: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000106a 30529 1726882632.69870: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30529 1726882632.69938: no more pending results, returning what we have 30529 1726882632.69940: results queue empty 30529 1726882632.69941: checking for any_errors_fatal 30529 1726882632.69948: done checking for any_errors_fatal 30529 1726882632.69948: checking for max_fail_percentage 30529 1726882632.69949: done checking for max_fail_percentage 30529 1726882632.69950: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.69951: done checking to see if all hosts have failed 30529 1726882632.69952: getting the remaining hosts for this loop 30529 1726882632.69953: done getting the remaining hosts for this loop 30529 1726882632.69957: getting the next task for host managed_node1 30529 1726882632.69965: done getting next task for host managed_node1 30529 1726882632.69967: ^ task is: TASK: Show current_interfaces 30529 1726882632.69970: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882632.69973: getting variables 30529 1726882632.69975: in VariableManager get_vars() 30529 1726882632.70006: Calling all_inventory to load vars for managed_node1 30529 1726882632.70009: Calling groups_inventory to load vars for managed_node1 30529 1726882632.70012: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.70022: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.70024: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.70027: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.71371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.73023: done with get_vars() 30529 1726882632.73044: done getting variables 30529 1726882632.73105: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:37:12 -0400 (0:00:00.053) 0:00:46.757 ****** 30529 1726882632.73144: entering _queue_task() for managed_node1/debug 30529 1726882632.73464: worker is 1 (out of 1 available) 30529 1726882632.73479: exiting _queue_task() for managed_node1/debug 30529 1726882632.73500: done queuing things up, now waiting for results queue to drain 30529 1726882632.73502: waiting for pending results... 
30529 1726882632.73729: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 30529 1726882632.73811: in run() - task 12673a56-9f93-b0f1-edc0-00000000102f 30529 1726882632.73821: variable 'ansible_search_path' from source: unknown 30529 1726882632.73825: variable 'ansible_search_path' from source: unknown 30529 1726882632.73856: calling self._execute() 30529 1726882632.73928: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.73932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.73941: variable 'omit' from source: magic vars 30529 1726882632.74204: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.74214: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.74225: variable 'omit' from source: magic vars 30529 1726882632.74253: variable 'omit' from source: magic vars 30529 1726882632.74321: variable 'current_interfaces' from source: set_fact 30529 1726882632.74344: variable 'omit' from source: magic vars 30529 1726882632.74373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882632.74402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882632.74419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882632.74433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.74445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882632.74468: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882632.74471: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.74473: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.74547: Set connection var ansible_shell_executable to /bin/sh 30529 1726882632.74550: Set connection var ansible_pipelining to False 30529 1726882632.74553: Set connection var ansible_shell_type to sh 30529 1726882632.74563: Set connection var ansible_timeout to 10 30529 1726882632.74565: Set connection var ansible_connection to ssh 30529 1726882632.74570: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882632.74586: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.74592: variable 'ansible_connection' from source: unknown 30529 1726882632.74596: variable 'ansible_module_compression' from source: unknown 30529 1726882632.74598: variable 'ansible_shell_type' from source: unknown 30529 1726882632.74600: variable 'ansible_shell_executable' from source: unknown 30529 1726882632.74604: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.74606: variable 'ansible_pipelining' from source: unknown 30529 1726882632.74608: variable 'ansible_timeout' from source: unknown 30529 1726882632.74610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.74708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882632.74718: variable 'omit' from source: magic vars 30529 1726882632.74722: starting attempt loop 30529 1726882632.74725: running the handler 30529 1726882632.74762: handler run complete 30529 1726882632.74773: attempt loop complete, returning result 30529 1726882632.74776: _execute() done 30529 1726882632.74778: dumping result to json 30529 1726882632.74780: done dumping result, returning 30529 1726882632.74784: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-00000000102f] 30529 1726882632.74796: sending task result for task 12673a56-9f93-b0f1-edc0-00000000102f 30529 1726882632.74872: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000102f 30529 1726882632.74874: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30529 1726882632.74918: no more pending results, returning what we have 30529 1726882632.74922: results queue empty 30529 1726882632.74923: checking for any_errors_fatal 30529 1726882632.74929: done checking for any_errors_fatal 30529 1726882632.74930: checking for max_fail_percentage 30529 1726882632.74931: done checking for max_fail_percentage 30529 1726882632.74932: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.74933: done checking to see if all hosts have failed 30529 1726882632.74934: getting the remaining hosts for this loop 30529 1726882632.74935: done getting the remaining hosts for this loop 30529 1726882632.74939: getting the next task for host managed_node1 30529 1726882632.74948: done getting next task for host managed_node1 30529 1726882632.74950: ^ task is: TASK: Setup 30529 1726882632.74953: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882632.74956: getting variables 30529 1726882632.74958: in VariableManager get_vars() 30529 1726882632.74987: Calling all_inventory to load vars for managed_node1 30529 1726882632.74989: Calling groups_inventory to load vars for managed_node1 30529 1726882632.74994: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.75004: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.75007: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.75010: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.80216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.81771: done with get_vars() 30529 1726882632.81806: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:37:12 -0400 (0:00:00.087) 0:00:46.844 ****** 30529 1726882632.81887: entering _queue_task() for managed_node1/include_tasks 30529 1726882632.82261: worker is 1 (out of 1 available) 30529 1726882632.82275: exiting _queue_task() for managed_node1/include_tasks 30529 1726882632.82298: done queuing things up, now waiting for results queue to drain 30529 1726882632.82300: waiting for pending results... 
30529 1726882632.82517: running TaskExecutor() for managed_node1/TASK: Setup 30529 1726882632.82625: in run() - task 12673a56-9f93-b0f1-edc0-000000001008 30529 1726882632.82637: variable 'ansible_search_path' from source: unknown 30529 1726882632.82642: variable 'ansible_search_path' from source: unknown 30529 1726882632.82699: variable 'lsr_setup' from source: include params 30529 1726882632.82936: variable 'lsr_setup' from source: include params 30529 1726882632.82982: variable 'omit' from source: magic vars 30529 1726882632.83124: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.83153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.83157: variable 'omit' from source: magic vars 30529 1726882632.83395: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.83481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.83485: variable 'item' from source: unknown 30529 1726882632.83487: variable 'item' from source: unknown 30529 1726882632.83510: variable 'item' from source: unknown 30529 1726882632.83575: variable 'item' from source: unknown 30529 1726882632.83704: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.83708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.83710: variable 'omit' from source: magic vars 30529 1726882632.84000: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.84003: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.84006: variable 'item' from source: unknown 30529 1726882632.84008: variable 'item' from source: unknown 30529 1726882632.84010: variable 'item' from source: unknown 30529 1726882632.84012: variable 'item' from source: unknown 30529 1726882632.84066: dumping result to json 30529 1726882632.84069: done dumping result, returning 30529 
1726882632.84071: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-000000001008] 30529 1726882632.84073: sending task result for task 12673a56-9f93-b0f1-edc0-000000001008 30529 1726882632.84241: no more pending results, returning what we have 30529 1726882632.84246: in VariableManager get_vars() 30529 1726882632.84284: Calling all_inventory to load vars for managed_node1 30529 1726882632.84286: Calling groups_inventory to load vars for managed_node1 30529 1726882632.84292: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.84299: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001008 30529 1726882632.84304: WORKER PROCESS EXITING 30529 1726882632.84403: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.84407: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.84410: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.85712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.87253: done with get_vars() 30529 1726882632.87272: variable 'ansible_search_path' from source: unknown 30529 1726882632.87274: variable 'ansible_search_path' from source: unknown 30529 1726882632.87319: variable 'ansible_search_path' from source: unknown 30529 1726882632.87320: variable 'ansible_search_path' from source: unknown 30529 1726882632.87348: we have included files to process 30529 1726882632.87349: generating all_blocks data 30529 1726882632.87351: done generating all_blocks data 30529 1726882632.87356: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882632.87357: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882632.87359: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882632.87601: done processing included file 30529 1726882632.87603: iterating over new_blocks loaded from include file 30529 1726882632.87605: in VariableManager get_vars() 30529 1726882632.87621: done with get_vars() 30529 1726882632.87623: filtering new block on tags 30529 1726882632.87657: done filtering new block on tags 30529 1726882632.87659: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node1 => (item=tasks/create_bridge_profile.yml) 30529 1726882632.87664: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882632.87665: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882632.87667: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882632.87763: done processing included file 30529 1726882632.87765: iterating over new_blocks loaded from include file 30529 1726882632.87767: in VariableManager get_vars() 30529 1726882632.87781: done with get_vars() 30529 1726882632.87782: filtering new block on tags 30529 1726882632.87809: done filtering new block on tags 30529 1726882632.87812: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node1 => (item=tasks/activate_profile.yml) 30529 1726882632.87815: extending task lists for all hosts with included blocks 30529 1726882632.88415: done extending task lists 30529 1726882632.88417: done processing 
included files 30529 1726882632.88418: results queue empty 30529 1726882632.88419: checking for any_errors_fatal 30529 1726882632.88422: done checking for any_errors_fatal 30529 1726882632.88423: checking for max_fail_percentage 30529 1726882632.88424: done checking for max_fail_percentage 30529 1726882632.88425: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.88426: done checking to see if all hosts have failed 30529 1726882632.88427: getting the remaining hosts for this loop 30529 1726882632.88428: done getting the remaining hosts for this loop 30529 1726882632.88431: getting the next task for host managed_node1 30529 1726882632.88435: done getting next task for host managed_node1 30529 1726882632.88437: ^ task is: TASK: Include network role 30529 1726882632.88440: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882632.88442: getting variables 30529 1726882632.88443: in VariableManager get_vars() 30529 1726882632.88452: Calling all_inventory to load vars for managed_node1 30529 1726882632.88460: Calling groups_inventory to load vars for managed_node1 30529 1726882632.88462: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.88468: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.88470: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.88473: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.89697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.91226: done with get_vars() 30529 1726882632.91244: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:37:12 -0400 (0:00:00.094) 0:00:46.939 ****** 30529 1726882632.91319: entering _queue_task() for managed_node1/include_role 30529 1726882632.91678: worker is 1 (out of 1 available) 30529 1726882632.91692: exiting _queue_task() for managed_node1/include_role 30529 1726882632.91908: done queuing things up, now waiting for results queue to drain 30529 1726882632.91910: waiting for pending results... 
30529 1726882632.92009: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882632.92123: in run() - task 12673a56-9f93-b0f1-edc0-00000000108f 30529 1726882632.92136: variable 'ansible_search_path' from source: unknown 30529 1726882632.92139: variable 'ansible_search_path' from source: unknown 30529 1726882632.92174: calling self._execute() 30529 1726882632.92270: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882632.92274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882632.92285: variable 'omit' from source: magic vars 30529 1726882632.92898: variable 'ansible_distribution_major_version' from source: facts 30529 1726882632.92902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882632.92904: _execute() done 30529 1726882632.92906: dumping result to json 30529 1726882632.92908: done dumping result, returning 30529 1726882632.92911: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-00000000108f] 30529 1726882632.92913: sending task result for task 12673a56-9f93-b0f1-edc0-00000000108f 30529 1726882632.92981: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000108f 30529 1726882632.92984: WORKER PROCESS EXITING 30529 1726882632.93022: no more pending results, returning what we have 30529 1726882632.93026: in VariableManager get_vars() 30529 1726882632.93056: Calling all_inventory to load vars for managed_node1 30529 1726882632.93059: Calling groups_inventory to load vars for managed_node1 30529 1726882632.93062: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.93070: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.93073: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.93076: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.94352: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882632.95368: done with get_vars() 30529 1726882632.95381: variable 'ansible_search_path' from source: unknown 30529 1726882632.95383: variable 'ansible_search_path' from source: unknown 30529 1726882632.95498: variable 'omit' from source: magic vars 30529 1726882632.95525: variable 'omit' from source: magic vars 30529 1726882632.95534: variable 'omit' from source: magic vars 30529 1726882632.95537: we have included files to process 30529 1726882632.95538: generating all_blocks data 30529 1726882632.95539: done generating all_blocks data 30529 1726882632.95540: processing included file: fedora.linux_system_roles.network 30529 1726882632.95552: in VariableManager get_vars() 30529 1726882632.95560: done with get_vars() 30529 1726882632.95578: in VariableManager get_vars() 30529 1726882632.95587: done with get_vars() 30529 1726882632.95622: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882632.95694: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882632.95743: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882632.96037: in VariableManager get_vars() 30529 1726882632.96061: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882632.98188: iterating over new_blocks loaded from include file 30529 1726882632.98194: in VariableManager get_vars() 30529 1726882632.98211: done with get_vars() 30529 1726882632.98213: filtering new block on tags 30529 1726882632.98516: done filtering new block on tags 30529 1726882632.98520: in VariableManager get_vars() 30529 1726882632.98535: done with get_vars() 30529 1726882632.98537: filtering new block on tags 30529 1726882632.98554: done 
filtering new block on tags 30529 1726882632.98556: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882632.98561: extending task lists for all hosts with included blocks 30529 1726882632.98731: done extending task lists 30529 1726882632.98732: done processing included files 30529 1726882632.98733: results queue empty 30529 1726882632.98734: checking for any_errors_fatal 30529 1726882632.98737: done checking for any_errors_fatal 30529 1726882632.98737: checking for max_fail_percentage 30529 1726882632.98739: done checking for max_fail_percentage 30529 1726882632.98739: checking to see if all hosts have failed and the running result is not ok 30529 1726882632.98740: done checking to see if all hosts have failed 30529 1726882632.98741: getting the remaining hosts for this loop 30529 1726882632.98742: done getting the remaining hosts for this loop 30529 1726882632.98745: getting the next task for host managed_node1 30529 1726882632.98749: done getting next task for host managed_node1 30529 1726882632.98752: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882632.98755: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882632.98765: getting variables 30529 1726882632.98766: in VariableManager get_vars() 30529 1726882632.98778: Calling all_inventory to load vars for managed_node1 30529 1726882632.98780: Calling groups_inventory to load vars for managed_node1 30529 1726882632.98782: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882632.98787: Calling all_plugins_play to load vars for managed_node1 30529 1726882632.98794: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882632.98798: Calling groups_plugins_play to load vars for managed_node1 30529 1726882632.99922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.02244: done with get_vars() 30529 1726882633.02266: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:13 -0400 (0:00:00.110) 0:00:47.049 ****** 30529 1726882633.02349: entering _queue_task() for managed_node1/include_tasks 30529 1726882633.02708: worker is 1 (out of 1 available) 30529 1726882633.02721: exiting _queue_task() for managed_node1/include_tasks 30529 1726882633.02735: done queuing things up, now waiting for results queue to drain 30529 1726882633.02737: waiting for pending results... 
30529 1726882633.02943: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882633.03027: in run() - task 12673a56-9f93-b0f1-edc0-0000000010f5 30529 1726882633.03043: variable 'ansible_search_path' from source: unknown 30529 1726882633.03047: variable 'ansible_search_path' from source: unknown 30529 1726882633.03075: calling self._execute() 30529 1726882633.03150: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.03155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.03161: variable 'omit' from source: magic vars 30529 1726882633.03440: variable 'ansible_distribution_major_version' from source: facts 30529 1726882633.03459: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882633.03464: _execute() done 30529 1726882633.03467: dumping result to json 30529 1726882633.03470: done dumping result, returning 30529 1726882633.03478: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-0000000010f5] 30529 1726882633.03483: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f5 30529 1726882633.03566: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f5 30529 1726882633.03569: WORKER PROCESS EXITING 30529 1726882633.03644: no more pending results, returning what we have 30529 1726882633.03649: in VariableManager get_vars() 30529 1726882633.03684: Calling all_inventory to load vars for managed_node1 30529 1726882633.03687: Calling groups_inventory to load vars for managed_node1 30529 1726882633.03691: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882633.03703: Calling all_plugins_play to load vars for managed_node1 30529 1726882633.03706: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882633.03708: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882633.04944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.06305: done with get_vars() 30529 1726882633.06327: variable 'ansible_search_path' from source: unknown 30529 1726882633.06329: variable 'ansible_search_path' from source: unknown 30529 1726882633.06378: we have included files to process 30529 1726882633.06379: generating all_blocks data 30529 1726882633.06382: done generating all_blocks data 30529 1726882633.06387: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882633.06389: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882633.06391: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882633.07064: done processing included file 30529 1726882633.07066: iterating over new_blocks loaded from include file 30529 1726882633.07066: in VariableManager get_vars() 30529 1726882633.07083: done with get_vars() 30529 1726882633.07085: filtering new block on tags 30529 1726882633.07117: done filtering new block on tags 30529 1726882633.07120: in VariableManager get_vars() 30529 1726882633.07133: done with get_vars() 30529 1726882633.07134: filtering new block on tags 30529 1726882633.07159: done filtering new block on tags 30529 1726882633.07161: in VariableManager get_vars() 30529 1726882633.07173: done with get_vars() 30529 1726882633.07174: filtering new block on tags 30529 1726882633.07207: done filtering new block on tags 30529 1726882633.07209: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882633.07217: extending task lists for 
all hosts with included blocks 30529 1726882633.08660: done extending task lists 30529 1726882633.08661: done processing included files 30529 1726882633.08662: results queue empty 30529 1726882633.08662: checking for any_errors_fatal 30529 1726882633.08664: done checking for any_errors_fatal 30529 1726882633.08665: checking for max_fail_percentage 30529 1726882633.08665: done checking for max_fail_percentage 30529 1726882633.08666: checking to see if all hosts have failed and the running result is not ok 30529 1726882633.08666: done checking to see if all hosts have failed 30529 1726882633.08667: getting the remaining hosts for this loop 30529 1726882633.08668: done getting the remaining hosts for this loop 30529 1726882633.08669: getting the next task for host managed_node1 30529 1726882633.08673: done getting next task for host managed_node1 30529 1726882633.08674: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882633.08677: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882633.08684: getting variables 30529 1726882633.08685: in VariableManager get_vars() 30529 1726882633.08695: Calling all_inventory to load vars for managed_node1 30529 1726882633.08696: Calling groups_inventory to load vars for managed_node1 30529 1726882633.08697: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882633.08701: Calling all_plugins_play to load vars for managed_node1 30529 1726882633.08702: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882633.08704: Calling groups_plugins_play to load vars for managed_node1 30529 1726882633.09506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.10353: done with get_vars() 30529 1726882633.10366: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:13 -0400 (0:00:00.080) 0:00:47.130 ****** 30529 1726882633.10420: entering _queue_task() for managed_node1/setup 30529 1726882633.10671: worker is 1 (out of 1 available) 30529 1726882633.10683: exiting _queue_task() for managed_node1/setup 30529 1726882633.10697: done queuing things up, now waiting for results queue to drain 30529 1726882633.10698: waiting for pending results... 
30529 1726882633.10880: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882633.10979: in run() - task 12673a56-9f93-b0f1-edc0-000000001152 30529 1726882633.10990: variable 'ansible_search_path' from source: unknown 30529 1726882633.10995: variable 'ansible_search_path' from source: unknown 30529 1726882633.11027: calling self._execute() 30529 1726882633.11099: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.11103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.11111: variable 'omit' from source: magic vars 30529 1726882633.11376: variable 'ansible_distribution_major_version' from source: facts 30529 1726882633.11386: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882633.11531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882633.13779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882633.13827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882633.13854: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882633.13924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882633.13965: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882633.14019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882633.14063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882633.14073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882633.14106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882633.14116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882633.14202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882633.14220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882633.14240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882633.14267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882633.14275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882633.14394: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882633.14400: variable 'ansible_facts' from source: unknown 30529 1726882633.14863: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882633.14868: when evaluation is False, skipping this task 30529 1726882633.14871: _execute() done 30529 1726882633.14873: dumping result to json 30529 1726882633.14876: done dumping result, returning 30529 1726882633.14881: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000001152] 30529 1726882633.14884: sending task result for task 12673a56-9f93-b0f1-edc0-000000001152 30529 1726882633.14976: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001152 30529 1726882633.14978: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882633.15037: no more pending results, returning what we have 30529 1726882633.15041: results queue empty 30529 1726882633.15042: checking for any_errors_fatal 30529 1726882633.15044: done checking for any_errors_fatal 30529 1726882633.15044: checking for max_fail_percentage 30529 1726882633.15046: done checking for max_fail_percentage 30529 1726882633.15047: checking to see if all hosts have failed and the running result is not ok 30529 1726882633.15048: done checking to see if all hosts have failed 30529 1726882633.15048: getting the remaining hosts for this loop 30529 1726882633.15050: done getting the remaining hosts for this loop 30529 1726882633.15053: getting the next task for host managed_node1 30529 1726882633.15065: done getting next task for host managed_node1 30529 1726882633.15069: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882633.15074: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882633.15098: getting variables 30529 1726882633.15100: in VariableManager get_vars() 30529 1726882633.15135: Calling all_inventory to load vars for managed_node1 30529 1726882633.15137: Calling groups_inventory to load vars for managed_node1 30529 1726882633.15139: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882633.15148: Calling all_plugins_play to load vars for managed_node1 30529 1726882633.15150: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882633.15158: Calling groups_plugins_play to load vars for managed_node1 30529 1726882633.16150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.17053: done with get_vars() 30529 1726882633.17072: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:13 -0400 (0:00:00.067) 0:00:47.197 ****** 30529 1726882633.17162: entering _queue_task() for managed_node1/stat 30529 1726882633.17429: worker is 1 (out of 1 available) 30529 1726882633.17443: exiting _queue_task() for managed_node1/stat 30529 1726882633.17457: done queuing things up, now waiting for results queue to drain 30529 1726882633.17458: waiting for pending results... 
30529 1726882633.17669: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882633.17774: in run() - task 12673a56-9f93-b0f1-edc0-000000001154 30529 1726882633.17788: variable 'ansible_search_path' from source: unknown 30529 1726882633.17791: variable 'ansible_search_path' from source: unknown 30529 1726882633.17822: calling self._execute() 30529 1726882633.17898: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.17902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.17912: variable 'omit' from source: magic vars 30529 1726882633.18284: variable 'ansible_distribution_major_version' from source: facts 30529 1726882633.18288: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882633.18423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882633.18670: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882633.18730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882633.18764: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882633.18808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882633.18883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882633.18936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882633.18939: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882633.19055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882633.19069: variable '__network_is_ostree' from source: set_fact 30529 1726882633.19080: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882633.19083: when evaluation is False, skipping this task 30529 1726882633.19087: _execute() done 30529 1726882633.19090: dumping result to json 30529 1726882633.19095: done dumping result, returning 30529 1726882633.19150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000001154] 30529 1726882633.19159: sending task result for task 12673a56-9f93-b0f1-edc0-000000001154 30529 1726882633.19227: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001154 30529 1726882633.19230: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882633.19345: no more pending results, returning what we have 30529 1726882633.19351: results queue empty 30529 1726882633.19353: checking for any_errors_fatal 30529 1726882633.19361: done checking for any_errors_fatal 30529 1726882633.19361: checking for max_fail_percentage 30529 1726882633.19363: done checking for max_fail_percentage 30529 1726882633.19363: checking to see if all hosts have failed and the running result is not ok 30529 1726882633.19364: done checking to see if all hosts have failed 30529 1726882633.19365: getting the remaining hosts for this loop 30529 1726882633.19366: done getting the remaining hosts for this loop 30529 
1726882633.19369: getting the next task for host managed_node1 30529 1726882633.19376: done getting next task for host managed_node1 30529 1726882633.19379: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882633.19384: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882633.19402: getting variables 30529 1726882633.19404: in VariableManager get_vars() 30529 1726882633.19441: Calling all_inventory to load vars for managed_node1 30529 1726882633.19444: Calling groups_inventory to load vars for managed_node1 30529 1726882633.19446: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882633.19456: Calling all_plugins_play to load vars for managed_node1 30529 1726882633.19462: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882633.19466: Calling groups_plugins_play to load vars for managed_node1 30529 1726882633.20350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.21399: done with get_vars() 30529 1726882633.21412: done getting variables 30529 1726882633.21449: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:13 -0400 (0:00:00.043) 0:00:47.240 ****** 30529 1726882633.21477: entering _queue_task() for managed_node1/set_fact 30529 1726882633.21756: worker is 1 (out of 1 available) 30529 1726882633.21771: exiting _queue_task() for managed_node1/set_fact 30529 1726882633.21791: done queuing things up, now waiting for results queue to drain 30529 1726882633.21796: waiting for pending results... 
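This companion task shares the same guard and is skipped for the same reason. A hedged sketch of the task at `set_facts.yml:17` — the fact name comes from the logged conditional, but the value expression and source register are assumptions:

```yaml
# Hypothetical sketch of the "Set flag to indicate system is ostree" task.
# It would normally persist the stat result as a fact so later runs of the
# role can skip both tasks; the register it reads from is assumed.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This pattern (probe once, cache the answer in a fact, guard both tasks on the fact being undefined) is why the log shows both tasks skipping on every subsequent pass through the role.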
30529 1726882633.22212: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882633.22308: in run() - task 12673a56-9f93-b0f1-edc0-000000001155 30529 1726882633.22329: variable 'ansible_search_path' from source: unknown 30529 1726882633.22333: variable 'ansible_search_path' from source: unknown 30529 1726882633.22377: calling self._execute() 30529 1726882633.22474: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.22478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.22492: variable 'omit' from source: magic vars 30529 1726882633.22799: variable 'ansible_distribution_major_version' from source: facts 30529 1726882633.22802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882633.22960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882633.23238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882633.23294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882633.23332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882633.23366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882633.23458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882633.23489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882633.23532: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882633.23564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882633.23661: variable '__network_is_ostree' from source: set_fact 30529 1726882633.23673: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882633.23679: when evaluation is False, skipping this task 30529 1726882633.23688: _execute() done 30529 1726882633.23698: dumping result to json 30529 1726882633.23705: done dumping result, returning 30529 1726882633.23726: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000001155] 30529 1726882633.23735: sending task result for task 12673a56-9f93-b0f1-edc0-000000001155 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882633.23949: no more pending results, returning what we have 30529 1726882633.23953: results queue empty 30529 1726882633.23954: checking for any_errors_fatal 30529 1726882633.23961: done checking for any_errors_fatal 30529 1726882633.23962: checking for max_fail_percentage 30529 1726882633.23963: done checking for max_fail_percentage 30529 1726882633.23964: checking to see if all hosts have failed and the running result is not ok 30529 1726882633.23965: done checking to see if all hosts have failed 30529 1726882633.23966: getting the remaining hosts for this loop 30529 1726882633.23968: done getting the remaining hosts for this loop 30529 1726882633.23971: getting the next task for host managed_node1 30529 1726882633.23984: done getting next task for host managed_node1 30529 
1726882633.23987: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882633.23996: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882633.24212: getting variables 30529 1726882633.24214: in VariableManager get_vars() 30529 1726882633.24251: Calling all_inventory to load vars for managed_node1 30529 1726882633.24254: Calling groups_inventory to load vars for managed_node1 30529 1726882633.24256: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882633.24262: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001155 30529 1726882633.24265: WORKER PROCESS EXITING 30529 1726882633.24274: Calling all_plugins_play to load vars for managed_node1 30529 1726882633.24277: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882633.24280: Calling groups_plugins_play to load vars for managed_node1 30529 1726882633.25083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882633.25953: done with get_vars() 30529 1726882633.25966: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:13 -0400 (0:00:00.045) 0:00:47.286 ****** 30529 1726882633.26036: entering _queue_task() for managed_node1/service_facts 30529 1726882633.26240: worker is 1 (out of 1 available) 30529 1726882633.26251: exiting _queue_task() for managed_node1/service_facts 30529 1726882633.26263: done queuing things up, now waiting for results queue to drain 30529 1726882633.26265: waiting for pending results... 
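Unlike the two guarded tasks above, this one actually executes: the log below shows the full remote round trip (connection plugin selection, temp-dir creation, AnsiballZ transfer, module run). A hedged sketch of the task at `set_facts.yml:21` — `service_facts` takes no required options; the register name is an assumption:

```yaml
# Hypothetical sketch of the "Check which services are running" task.
# service_facts populates ansible_facts.services on the target host.
- name: Check which services are running
  service_facts:
  register: __network_service_facts   # assumed register name
```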
30529 1726882633.26511: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882633.26665: in run() - task 12673a56-9f93-b0f1-edc0-000000001157 30529 1726882633.26699: variable 'ansible_search_path' from source: unknown 30529 1726882633.26702: variable 'ansible_search_path' from source: unknown 30529 1726882633.26798: calling self._execute() 30529 1726882633.26830: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.26842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.26856: variable 'omit' from source: magic vars 30529 1726882633.27211: variable 'ansible_distribution_major_version' from source: facts 30529 1726882633.27229: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882633.27240: variable 'omit' from source: magic vars 30529 1726882633.27326: variable 'omit' from source: magic vars 30529 1726882633.27362: variable 'omit' from source: magic vars 30529 1726882633.27406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882633.27444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882633.27468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882633.27598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882633.27602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882633.27604: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882633.27607: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.27609: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882633.27669: Set connection var ansible_shell_executable to /bin/sh 30529 1726882633.27680: Set connection var ansible_pipelining to False 30529 1726882633.27688: Set connection var ansible_shell_type to sh 30529 1726882633.27709: Set connection var ansible_timeout to 10 30529 1726882633.27715: Set connection var ansible_connection to ssh 30529 1726882633.27724: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882633.27750: variable 'ansible_shell_executable' from source: unknown 30529 1726882633.27759: variable 'ansible_connection' from source: unknown 30529 1726882633.27769: variable 'ansible_module_compression' from source: unknown 30529 1726882633.27799: variable 'ansible_shell_type' from source: unknown 30529 1726882633.27802: variable 'ansible_shell_executable' from source: unknown 30529 1726882633.27804: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882633.27807: variable 'ansible_pipelining' from source: unknown 30529 1726882633.27809: variable 'ansible_timeout' from source: unknown 30529 1726882633.27811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882633.28009: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882633.28095: variable 'omit' from source: magic vars 30529 1726882633.28098: starting attempt loop 30529 1726882633.28101: running the handler 30529 1726882633.28103: _low_level_execute_command(): starting 30529 1726882633.28105: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882633.28697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882633.28715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882633.28728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.28770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882633.28783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882633.28838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882633.30508: stdout chunk (state=3): >>>/root <<< 30529 1726882633.30755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882633.30758: stdout chunk (state=3): >>><<< 30529 1726882633.30761: stderr chunk (state=3): >>><<< 30529 1726882633.30765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882633.30767: _low_level_execute_command(): starting 30529 1726882633.30770: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604 `" && echo ansible-tmp-1726882633.3066928-32805-275143614634604="` echo /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604 `" ) && sleep 0' 30529 1726882633.31174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882633.31189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882633.31218: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.31252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882633.31264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882633.31313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882633.33173: stdout chunk (state=3): >>>ansible-tmp-1726882633.3066928-32805-275143614634604=/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604 <<< 30529 1726882633.33328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882633.33332: stdout chunk (state=3): >>><<< 30529 1726882633.33334: stderr chunk (state=3): >>><<< 30529 1726882633.33349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882633.3066928-32805-275143614634604=/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882633.33498: variable 'ansible_module_compression' from source: unknown 30529 1726882633.33501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882633.33504: variable 'ansible_facts' from source: unknown 30529 1726882633.33570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py 30529 1726882633.33713: Sending initial data 30529 1726882633.33723: Sent initial data (162 bytes) 30529 1726882633.34191: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882633.34207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.34219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.34269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882633.34288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882633.34322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882633.35836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882633.35881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882633.35942: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp6yoantf3 /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py <<< 30529 1726882633.35945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py" <<< 30529 1726882633.35976: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp6yoantf3" to remote "/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py" <<< 30529 1726882633.36724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882633.36764: stderr chunk (state=3): >>><<< 30529 1726882633.36769: stdout chunk (state=3): >>><<< 30529 1726882633.36872: done transferring module to remote 30529 1726882633.36875: _low_level_execute_command(): starting 30529 1726882633.36879: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/ /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py && sleep 0' 30529 1726882633.37375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882633.37418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.37457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882633.37475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882633.37528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882633.39228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882633.39294: stderr chunk (state=3): >>><<< 30529 1726882633.39299: stdout chunk (state=3): >>><<< 30529 1726882633.39382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882633.39385: _low_level_execute_command(): starting 30529 1726882633.39392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/AnsiballZ_service_facts.py && sleep 0' 30529 1726882633.40006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882633.40009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882633.40011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.40013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882633.40015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882633.40067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882633.40083: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882633.40134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882634.90707: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882634.90720: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30529 1726882634.90728: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882634.92399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882634.92403: stderr chunk (state=3): >>><<< 30529 1726882634.92405: stdout chunk (state=3): >>><<< 30529 1726882634.92410: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
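For reference, the `service_facts` payload that ends above is a flat mapping from unit name to a small record with `name`, `state`, `status`, and `source` keys. A minimal Python sketch of filtering such a dict (using a hand-copied subset of the entries from this log; the helper names `running_units` and `missing_units` are illustrative, not part of Ansible) might look like:

```python
# A hand-copied subset of the service_facts dict from the log above.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
    "ntpd.service": {"name": "ntpd.service", "state": "stopped",
                     "status": "not-found", "source": "systemd"},
}

def running_units(facts):
    """Names of units reported as currently running."""
    return sorted(n for n, u in facts.items() if u["state"] == "running")

def missing_units(facts):
    """Units systemd reported as not-found (no unit file installed)."""
    return sorted(n for n, u in facts.items() if u["status"] == "not-found")

print(running_units(services))   # ['sshd.service']
print(missing_units(services))   # ['ntpd.service']
```

In a playbook the same dict is reachable as `ansible_facts.services` after the "Check which services are running" task, which is how roles such as `fedora.linux_system_roles.network` decide which services already exist on the host.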
30529 1726882634.93406: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882634.93423: _low_level_execute_command(): starting 30529 1726882634.93432: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882633.3066928-32805-275143614634604/ > /dev/null 2>&1 && sleep 0' 30529 1726882634.94077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882634.94103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882634.94118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882634.94148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882634.94263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882634.94288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882634.94323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882634.94474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882634.96199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882634.96234: stderr chunk (state=3): >>><<< 30529 1726882634.96250: stdout chunk (state=3): >>><<< 30529 1726882634.96277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30529 1726882634.96298: handler run complete 30529 1726882634.96599: variable 'ansible_facts' from source: unknown 30529 1726882634.96669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882634.97107: variable 'ansible_facts' from source: unknown 30529 1726882634.97201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882634.97415: attempt loop complete, returning result 30529 1726882634.97419: _execute() done 30529 1726882634.97421: dumping result to json 30529 1726882634.97511: done dumping result, returning 30529 1726882634.97515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000001157] 30529 1726882634.97517: sending task result for task 12673a56-9f93-b0f1-edc0-000000001157 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882634.98850: no more pending results, returning what we have 30529 1726882634.98853: results queue empty 30529 1726882634.98854: checking for any_errors_fatal 30529 1726882634.98856: done checking for any_errors_fatal 30529 1726882634.98857: checking for max_fail_percentage 30529 1726882634.98858: done checking for max_fail_percentage 30529 1726882634.98859: checking to see if all hosts have failed and the running result is not ok 30529 1726882634.98860: done checking to see if all hosts have failed 30529 1726882634.98861: getting the remaining hosts for this loop 30529 1726882634.98862: done getting the remaining hosts for this loop 30529 1726882634.98865: getting the next task for host managed_node1 30529 1726882634.98871: done getting next task for host managed_node1 30529 1726882634.98874: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 30529 1726882634.98881: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882634.98891: getting variables 30529 1726882634.98896: in VariableManager get_vars() 30529 1726882634.98929: Calling all_inventory to load vars for managed_node1 30529 1726882634.98932: Calling groups_inventory to load vars for managed_node1 30529 1726882634.98934: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882634.98943: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001157 30529 1726882634.98946: WORKER PROCESS EXITING 30529 1726882634.98954: Calling all_plugins_play to load vars for managed_node1 30529 1726882634.98957: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882634.98963: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.00094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.01106: done with get_vars() 30529 1726882635.01122: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:15 -0400 (0:00:01.751) 0:00:49.038 ****** 30529 1726882635.01194: entering _queue_task() for managed_node1/package_facts 30529 1726882635.01427: worker is 1 (out of 1 available) 30529 1726882635.01439: exiting _queue_task() for managed_node1/package_facts 30529 1726882635.01453: done queuing things up, now waiting for results queue to drain 30529 1726882635.01454: waiting for pending results... 
30529 1726882635.01677: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882635.01897: in run() - task 12673a56-9f93-b0f1-edc0-000000001158 30529 1726882635.01904: variable 'ansible_search_path' from source: unknown 30529 1726882635.01907: variable 'ansible_search_path' from source: unknown 30529 1726882635.01960: calling self._execute() 30529 1726882635.02230: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.02237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.02241: variable 'omit' from source: magic vars 30529 1726882635.03166: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.03201: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882635.03228: variable 'omit' from source: magic vars 30529 1726882635.03308: variable 'omit' from source: magic vars 30529 1726882635.03344: variable 'omit' from source: magic vars 30529 1726882635.03389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882635.03414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882635.03450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882635.03471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882635.03490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882635.03526: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882635.03530: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.03532: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882635.03675: Set connection var ansible_shell_executable to /bin/sh 30529 1726882635.03678: Set connection var ansible_pipelining to False 30529 1726882635.03681: Set connection var ansible_shell_type to sh 30529 1726882635.03684: Set connection var ansible_timeout to 10 30529 1726882635.03686: Set connection var ansible_connection to ssh 30529 1726882635.03691: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882635.03712: variable 'ansible_shell_executable' from source: unknown 30529 1726882635.03715: variable 'ansible_connection' from source: unknown 30529 1726882635.03718: variable 'ansible_module_compression' from source: unknown 30529 1726882635.03721: variable 'ansible_shell_type' from source: unknown 30529 1726882635.03724: variable 'ansible_shell_executable' from source: unknown 30529 1726882635.03726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.03728: variable 'ansible_pipelining' from source: unknown 30529 1726882635.03730: variable 'ansible_timeout' from source: unknown 30529 1726882635.03733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.03957: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882635.03966: variable 'omit' from source: magic vars 30529 1726882635.03972: starting attempt loop 30529 1726882635.03975: running the handler 30529 1726882635.03998: _low_level_execute_command(): starting 30529 1726882635.04007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882635.04566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882635.04574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882635.04583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.04608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.04612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.04653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.04670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.04719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.06302: stdout chunk (state=3): >>>/root <<< 30529 1726882635.06408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882635.06437: stderr chunk (state=3): >>><<< 30529 1726882635.06439: stdout chunk (state=3): >>><<< 30529 1726882635.06456: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882635.06520: _low_level_execute_command(): starting 30529 1726882635.06525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372 `" && echo ansible-tmp-1726882635.0646074-32913-141744523019372="` echo /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372 `" ) && sleep 0' 30529 1726882635.06902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882635.06905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882635.06908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.06956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.06984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.07009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.07066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.09032: stdout chunk (state=3): >>>ansible-tmp-1726882635.0646074-32913-141744523019372=/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372 <<< 30529 1726882635.09153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882635.09213: stdout chunk (state=3): >>><<< 30529 1726882635.09223: stderr chunk (state=3): >>><<< 30529 1726882635.09417: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882635.0646074-32913-141744523019372=/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882635.09421: variable 'ansible_module_compression' from source: unknown 30529 1726882635.09423: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882635.09513: variable 'ansible_facts' from source: unknown 30529 1726882635.09779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py 30529 1726882635.10042: Sending initial data 30529 1726882635.10045: Sent initial data (162 bytes) 30529 1726882635.10609: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882635.10625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882635.10727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.10808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882635.10812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.10848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.10943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.12772: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882635.12777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882635.12780: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpz2cl08zz /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py <<< 30529 1726882635.12784: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py" <<< 30529 1726882635.12787: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpz2cl08zz" to remote "/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py" <<< 30529 1726882635.14165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882635.14202: stderr chunk (state=3): >>><<< 30529 1726882635.14206: stdout chunk (state=3): >>><<< 30529 1726882635.14246: done transferring module to remote 30529 1726882635.14256: _low_level_execute_command(): starting 30529 1726882635.14260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/ /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py && sleep 0' 30529 1726882635.14672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.14676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.14678: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.14680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.14730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.14733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.14782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.16501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882635.16522: stderr chunk (state=3): >>><<< 30529 1726882635.16525: stdout chunk (state=3): >>><<< 30529 1726882635.16544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882635.16549: _low_level_execute_command(): starting 30529 1726882635.16552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/AnsiballZ_package_facts.py && sleep 0' 30529 1726882635.16951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.16954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.16956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882635.16958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.16960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.17018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.17020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.17059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.60639: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882635.60689: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": 
"libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": 
"1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": 
"2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": 
[{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": 
[{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30529 1726882635.60925: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": 
"2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": 
[{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882635.62610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882635.62632: stderr chunk (state=3): >>><<< 30529 1726882635.62636: stdout chunk (state=3): >>><<< 30529 1726882635.62670: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882635.64098: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882635.64107: _low_level_execute_command(): starting 30529 1726882635.64115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882635.0646074-32913-141744523019372/ > /dev/null 2>&1 && sleep 0' 30529 1726882635.64823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882635.64827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882635.64830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882635.64911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882635.64959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882635.64985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882635.66785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882635.66815: stderr chunk (state=3): >>><<< 30529 1726882635.66819: stdout chunk (state=3): >>><<< 30529 1726882635.66834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882635.66842: handler run complete 30529 1726882635.67382: variable 'ansible_facts' 
from source: unknown 30529 1726882635.68119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.70404: variable 'ansible_facts' from source: unknown 30529 1726882635.70742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.71366: attempt loop complete, returning result 30529 1726882635.71470: _execute() done 30529 1726882635.71473: dumping result to json 30529 1726882635.71566: done dumping result, returning 30529 1726882635.71583: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000001158] 30529 1726882635.71587: sending task result for task 12673a56-9f93-b0f1-edc0-000000001158 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882635.73321: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001158 30529 1726882635.73327: WORKER PROCESS EXITING 30529 1726882635.73338: no more pending results, returning what we have 30529 1726882635.73340: results queue empty 30529 1726882635.73341: checking for any_errors_fatal 30529 1726882635.73346: done checking for any_errors_fatal 30529 1726882635.73346: checking for max_fail_percentage 30529 1726882635.73347: done checking for max_fail_percentage 30529 1726882635.73348: checking to see if all hosts have failed and the running result is not ok 30529 1726882635.73348: done checking to see if all hosts have failed 30529 1726882635.73349: getting the remaining hosts for this loop 30529 1726882635.73350: done getting the remaining hosts for this loop 30529 1726882635.73352: getting the next task for host managed_node1 30529 1726882635.73358: done getting next task for host managed_node1 30529 1726882635.73361: ^ task is: TASK: fedora.linux_system_roles.network : 
Print network provider 30529 1726882635.73365: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882635.73372: getting variables 30529 1726882635.73373: in VariableManager get_vars() 30529 1726882635.73399: Calling all_inventory to load vars for managed_node1 30529 1726882635.73400: Calling groups_inventory to load vars for managed_node1 30529 1726882635.73402: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882635.73408: Calling all_plugins_play to load vars for managed_node1 30529 1726882635.73410: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882635.73412: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.74112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.74982: done with get_vars() 30529 1726882635.75002: done getting variables 30529 1726882635.75043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:15 -0400 (0:00:00.738) 0:00:49.776 ****** 30529 1726882635.75076: entering _queue_task() for managed_node1/debug 30529 1726882635.75439: worker is 1 (out of 1 available) 30529 1726882635.75458: exiting _queue_task() for managed_node1/debug 30529 1726882635.75476: done queuing things up, now waiting for results queue to drain 30529 1726882635.75478: waiting for pending results... 
30529 1726882635.75911: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882635.76001: in run() - task 12673a56-9f93-b0f1-edc0-0000000010f6 30529 1726882635.76098: variable 'ansible_search_path' from source: unknown 30529 1726882635.76102: variable 'ansible_search_path' from source: unknown 30529 1726882635.76106: calling self._execute() 30529 1726882635.76184: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.76207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.76230: variable 'omit' from source: magic vars 30529 1726882635.76655: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.76668: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882635.76679: variable 'omit' from source: magic vars 30529 1726882635.76738: variable 'omit' from source: magic vars 30529 1726882635.76860: variable 'network_provider' from source: set_fact 30529 1726882635.76888: variable 'omit' from source: magic vars 30529 1726882635.76933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882635.76960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882635.76976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882635.76989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882635.77010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882635.77032: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882635.77048: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882635.77052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.77197: Set connection var ansible_shell_executable to /bin/sh 30529 1726882635.77204: Set connection var ansible_pipelining to False 30529 1726882635.77207: Set connection var ansible_shell_type to sh 30529 1726882635.77209: Set connection var ansible_timeout to 10 30529 1726882635.77211: Set connection var ansible_connection to ssh 30529 1726882635.77213: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882635.77231: variable 'ansible_shell_executable' from source: unknown 30529 1726882635.77244: variable 'ansible_connection' from source: unknown 30529 1726882635.77252: variable 'ansible_module_compression' from source: unknown 30529 1726882635.77260: variable 'ansible_shell_type' from source: unknown 30529 1726882635.77274: variable 'ansible_shell_executable' from source: unknown 30529 1726882635.77285: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.77305: variable 'ansible_pipelining' from source: unknown 30529 1726882635.77322: variable 'ansible_timeout' from source: unknown 30529 1726882635.77325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.77422: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882635.77431: variable 'omit' from source: magic vars 30529 1726882635.77451: starting attempt loop 30529 1726882635.77454: running the handler 30529 1726882635.77560: handler run complete 30529 1726882635.77563: attempt loop complete, returning result 30529 1726882635.77566: _execute() done 30529 1726882635.77568: dumping result to json 30529 1726882635.77570: done dumping result, returning 
30529 1726882635.77573: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-0000000010f6] 30529 1726882635.77575: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f6 30529 1726882635.77667: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f6 30529 1726882635.77670: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882635.77751: no more pending results, returning what we have 30529 1726882635.77754: results queue empty 30529 1726882635.77756: checking for any_errors_fatal 30529 1726882635.77767: done checking for any_errors_fatal 30529 1726882635.77768: checking for max_fail_percentage 30529 1726882635.77769: done checking for max_fail_percentage 30529 1726882635.77770: checking to see if all hosts have failed and the running result is not ok 30529 1726882635.77771: done checking to see if all hosts have failed 30529 1726882635.77771: getting the remaining hosts for this loop 30529 1726882635.77773: done getting the remaining hosts for this loop 30529 1726882635.77776: getting the next task for host managed_node1 30529 1726882635.77787: done getting next task for host managed_node1 30529 1726882635.77790: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882635.77797: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882635.77808: getting variables 30529 1726882635.77810: in VariableManager get_vars() 30529 1726882635.77840: Calling all_inventory to load vars for managed_node1 30529 1726882635.77842: Calling groups_inventory to load vars for managed_node1 30529 1726882635.77844: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882635.77853: Calling all_plugins_play to load vars for managed_node1 30529 1726882635.77855: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882635.77857: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.78741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.79791: done with get_vars() 30529 1726882635.79808: done getting variables 30529 1726882635.79846: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:15 -0400 (0:00:00.047) 0:00:49.824 ****** 30529 1726882635.79873: entering _queue_task() for managed_node1/fail 30529 1726882635.80085: worker is 1 (out of 1 available) 30529 1726882635.80102: exiting _queue_task() for managed_node1/fail 30529 1726882635.80115: done queuing things up, now waiting for results queue to drain 30529 1726882635.80117: waiting for pending results... 30529 1726882635.80309: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882635.80380: in run() - task 12673a56-9f93-b0f1-edc0-0000000010f7 30529 1726882635.80396: variable 'ansible_search_path' from source: unknown 30529 1726882635.80401: variable 'ansible_search_path' from source: unknown 30529 1726882635.80426: calling self._execute() 30529 1726882635.80501: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.80504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.80513: variable 'omit' from source: magic vars 30529 1726882635.80832: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.80874: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882635.80958: variable 'network_state' from source: role '' defaults 30529 1726882635.80967: Evaluated conditional (network_state != {}): False 30529 1726882635.80971: when evaluation is False, skipping this task 30529 1726882635.80973: _execute() done 30529 1726882635.80976: dumping result to json 30529 1726882635.80978: done dumping result, returning 30529 1726882635.80985: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-0000000010f7] 30529 1726882635.81000: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f7 30529 1726882635.81107: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f7 30529 1726882635.81111: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882635.81205: no more pending results, returning what we have 30529 1726882635.81209: results queue empty 30529 1726882635.81210: checking for any_errors_fatal 30529 1726882635.81217: done checking for any_errors_fatal 30529 1726882635.81218: checking for max_fail_percentage 30529 1726882635.81219: done checking for max_fail_percentage 30529 1726882635.81220: checking to see if all hosts have failed and the running result is not ok 30529 1726882635.81221: done checking to see if all hosts have failed 30529 1726882635.81222: getting the remaining hosts for this loop 30529 1726882635.81223: done getting the remaining hosts for this loop 30529 1726882635.81226: getting the next task for host managed_node1 30529 1726882635.81234: done getting next task for host managed_node1 30529 1726882635.81237: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882635.81242: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882635.81258: getting variables 30529 1726882635.81259: in VariableManager get_vars() 30529 1726882635.81292: Calling all_inventory to load vars for managed_node1 30529 1726882635.81296: Calling groups_inventory to load vars for managed_node1 30529 1726882635.81298: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882635.81307: Calling all_plugins_play to load vars for managed_node1 30529 1726882635.81310: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882635.81312: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.82245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.83552: done with get_vars() 30529 1726882635.83566: done getting variables 30529 1726882635.83608: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:15 -0400 (0:00:00.037) 0:00:49.862 ****** 30529 1726882635.83633: entering _queue_task() for managed_node1/fail 30529 1726882635.83816: worker is 1 (out of 1 available) 30529 1726882635.83829: exiting _queue_task() for managed_node1/fail 30529 1726882635.83841: done queuing things up, now waiting for results queue to drain 30529 1726882635.83842: waiting for pending results... 30529 1726882635.84056: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882635.84139: in run() - task 12673a56-9f93-b0f1-edc0-0000000010f8 30529 1726882635.84149: variable 'ansible_search_path' from source: unknown 30529 1726882635.84152: variable 'ansible_search_path' from source: unknown 30529 1726882635.84180: calling self._execute() 30529 1726882635.84252: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.84256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.84265: variable 'omit' from source: magic vars 30529 1726882635.84653: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.84656: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882635.84727: variable 'network_state' from source: role '' defaults 30529 1726882635.84735: Evaluated conditional (network_state != {}): False 30529 1726882635.84738: when evaluation is False, skipping this task 30529 1726882635.84749: _execute() done 30529 1726882635.84753: dumping result to json 30529 1726882635.84756: done dumping result, returning 30529 1726882635.84759: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-0000000010f8] 30529 1726882635.84761: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f8 30529 1726882635.84864: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f8 30529 1726882635.84870: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882635.84931: no more pending results, returning what we have 30529 1726882635.84934: results queue empty 30529 1726882635.84935: checking for any_errors_fatal 30529 1726882635.84939: done checking for any_errors_fatal 30529 1726882635.84940: checking for max_fail_percentage 30529 1726882635.84942: done checking for max_fail_percentage 30529 1726882635.84942: checking to see if all hosts have failed and the running result is not ok 30529 1726882635.84943: done checking to see if all hosts have failed 30529 1726882635.84944: getting the remaining hosts for this loop 30529 1726882635.84945: done getting the remaining hosts for this loop 30529 1726882635.84948: getting the next task for host managed_node1 30529 1726882635.84958: done getting next task for host managed_node1 30529 1726882635.84964: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882635.84971: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882635.84998: getting variables 30529 1726882635.85002: in VariableManager get_vars() 30529 1726882635.85040: Calling all_inventory to load vars for managed_node1 30529 1726882635.85043: Calling groups_inventory to load vars for managed_node1 30529 1726882635.85045: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882635.85056: Calling all_plugins_play to load vars for managed_node1 30529 1726882635.85059: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882635.85063: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.86241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882635.88381: done with get_vars() 30529 1726882635.88411: done getting variables 30529 1726882635.88454: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:15 -0400 (0:00:00.048) 0:00:49.910 ****** 30529 1726882635.88477: entering _queue_task() for managed_node1/fail 30529 1726882635.88706: worker is 1 (out of 1 available) 30529 1726882635.88719: exiting _queue_task() for managed_node1/fail 30529 1726882635.88732: done queuing things up, now waiting for results queue to drain 30529 1726882635.88734: waiting for pending results... 30529 1726882635.88922: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882635.89007: in run() - task 12673a56-9f93-b0f1-edc0-0000000010f9 30529 1726882635.89018: variable 'ansible_search_path' from source: unknown 30529 1726882635.89024: variable 'ansible_search_path' from source: unknown 30529 1726882635.89056: calling self._execute() 30529 1726882635.89130: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882635.89133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882635.89142: variable 'omit' from source: magic vars 30529 1726882635.89414: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.89424: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882635.89564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882635.91822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882635.91945: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882635.91985: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882635.92053: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882635.92076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882635.92197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882635.92279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882635.92313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882635.92379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882635.92388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882635.92547: variable 'ansible_distribution_major_version' from source: facts 30529 1726882635.92582: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882635.92799: variable 'ansible_distribution' from source: facts 30529 1726882635.92835: variable '__network_rh_distros' from source: role '' defaults 30529 1726882635.92853: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882635.93462: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882635.93495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882635.93552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882635.93850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882635.93853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882635.93856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882635.93886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882635.93920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882635.93962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882635.94188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882635.94194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882635.94198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882635.94200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882635.94338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882635.94357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882635.94895: variable 'network_connections' from source: include params 30529 1726882635.94913: variable 'interface' from source: play vars 30529 1726882635.94987: variable 'interface' from source: play vars 30529 1726882635.95010: variable 'network_state' from source: role '' defaults 30529 1726882635.95078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882635.95282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882635.95343: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882635.95496: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882635.95500: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882635.95503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882635.95505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882635.95538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882635.95586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882635.95638: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882635.95646: when evaluation is False, skipping this task 30529 1726882635.95668: _execute() done 30529 1726882635.95676: dumping result to json 30529 1726882635.95683: done dumping result, returning 30529 1726882635.95711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-0000000010f9] 30529 1726882635.95720: sending task result for task 
12673a56-9f93-b0f1-edc0-0000000010f9 30529 1726882635.96018: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010f9 30529 1726882635.96022: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882635.96072: no more pending results, returning what we have 30529 1726882635.96076: results queue empty 30529 1726882635.96077: checking for any_errors_fatal 30529 1726882635.96085: done checking for any_errors_fatal 30529 1726882635.96086: checking for max_fail_percentage 30529 1726882635.96088: done checking for max_fail_percentage 30529 1726882635.96095: checking to see if all hosts have failed and the running result is not ok 30529 1726882635.96096: done checking to see if all hosts have failed 30529 1726882635.96097: getting the remaining hosts for this loop 30529 1726882635.96099: done getting the remaining hosts for this loop 30529 1726882635.96103: getting the next task for host managed_node1 30529 1726882635.96112: done getting next task for host managed_node1 30529 1726882635.96116: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882635.96121: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882635.96144: getting variables 30529 1726882635.96149: in VariableManager get_vars() 30529 1726882635.96192: Calling all_inventory to load vars for managed_node1 30529 1726882635.96328: Calling groups_inventory to load vars for managed_node1 30529 1726882635.96332: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882635.96343: Calling all_plugins_play to load vars for managed_node1 30529 1726882635.96346: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882635.96350: Calling groups_plugins_play to load vars for managed_node1 30529 1726882635.98008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.04404: done with get_vars() 30529 1726882636.04431: done getting variables 30529 1726882636.04492: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:16 -0400 (0:00:00.160) 0:00:50.071 ****** 30529 1726882636.04531: entering _queue_task() for managed_node1/dnf 30529 1726882636.05125: worker is 1 (out of 1 available) 30529 1726882636.05133: exiting _queue_task() for managed_node1/dnf 30529 1726882636.05149: done queuing things up, now waiting for results queue to drain 30529 1726882636.05151: waiting for pending results... 30529 1726882636.05242: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882636.05426: in run() - task 12673a56-9f93-b0f1-edc0-0000000010fa 30529 1726882636.05453: variable 'ansible_search_path' from source: unknown 30529 1726882636.05462: variable 'ansible_search_path' from source: unknown 30529 1726882636.05507: calling self._execute() 30529 1726882636.05626: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.05668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.05696: variable 'omit' from source: magic vars 30529 1726882636.06115: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.06125: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.06308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882636.08526: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.08590: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.08616: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.08643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.08662: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.08727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.08748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.08765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.08790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.08812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.08887: variable 'ansible_distribution' from source: facts 30529 1726882636.08890: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.08907: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882636.08983: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.09078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.09109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.09159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.09274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.09278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.09281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.09283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.09311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.09498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.09502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.09505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.09507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.09510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.09512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.09514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.09650: variable 'network_connections' from source: include params 30529 1726882636.09661: variable 'interface' from source: play vars 30529 1726882636.09725: variable 'interface' from source: play vars 30529 1726882636.09791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882636.09960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882636.10009: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882636.10033: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882636.10070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882636.10107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882636.10128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882636.10164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.10181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882636.10244: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882636.10398: variable 'network_connections' from source: include params 30529 1726882636.10401: variable 'interface' from source: play vars 30529 1726882636.10450: variable 'interface' from source: play vars 30529 1726882636.10473: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882636.10476: when evaluation is False, skipping this task 30529 1726882636.10479: _execute() done 30529 1726882636.10482: dumping result to json 30529 1726882636.10484: done dumping result, returning 30529 1726882636.10491: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000010fa] 30529 
1726882636.10501: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fa 30529 1726882636.10589: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fa 30529 1726882636.10592: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882636.10651: no more pending results, returning what we have 30529 1726882636.10655: results queue empty 30529 1726882636.10656: checking for any_errors_fatal 30529 1726882636.10665: done checking for any_errors_fatal 30529 1726882636.10666: checking for max_fail_percentage 30529 1726882636.10667: done checking for max_fail_percentage 30529 1726882636.10668: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.10669: done checking to see if all hosts have failed 30529 1726882636.10670: getting the remaining hosts for this loop 30529 1726882636.10671: done getting the remaining hosts for this loop 30529 1726882636.10675: getting the next task for host managed_node1 30529 1726882636.10684: done getting next task for host managed_node1 30529 1726882636.10687: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882636.10692: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.10714: getting variables 30529 1726882636.10717: in VariableManager get_vars() 30529 1726882636.10753: Calling all_inventory to load vars for managed_node1 30529 1726882636.10755: Calling groups_inventory to load vars for managed_node1 30529 1726882636.10757: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.10767: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.10769: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.10772: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.11592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.12569: done with get_vars() 30529 1726882636.12591: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882636.12685: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:16 -0400 (0:00:00.081) 0:00:50.153 ****** 30529 1726882636.12736: entering _queue_task() for managed_node1/yum 30529 1726882636.13072: worker is 1 (out of 1 available) 30529 1726882636.13091: exiting _queue_task() for managed_node1/yum 30529 1726882636.13106: done queuing things up, now waiting for results queue to drain 30529 1726882636.13110: waiting for pending results... 30529 1726882636.13377: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882636.13639: in run() - task 12673a56-9f93-b0f1-edc0-0000000010fb 30529 1726882636.13644: variable 'ansible_search_path' from source: unknown 30529 1726882636.13646: variable 'ansible_search_path' from source: unknown 30529 1726882636.13649: calling self._execute() 30529 1726882636.13737: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.13999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.14003: variable 'omit' from source: magic vars 30529 1726882636.14996: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.15073: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.15539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882636.18827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.19151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.19178: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.19207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.19229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.19285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.19310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.19326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.19356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.19364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.19431: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.19444: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882636.19447: when evaluation is False, skipping this task 30529 1726882636.19449: _execute() done 30529 1726882636.19452: dumping result to json 30529 1726882636.19454: done dumping result, returning 30529 1726882636.19463: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000010fb] 30529 1726882636.19467: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fb 30529 1726882636.19555: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fb 30529 1726882636.19558: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882636.19627: no more pending results, returning what we have 30529 1726882636.19630: results queue empty 30529 1726882636.19631: checking for any_errors_fatal 30529 1726882636.19637: done checking for any_errors_fatal 30529 1726882636.19638: checking for max_fail_percentage 30529 1726882636.19639: done checking for max_fail_percentage 30529 1726882636.19640: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.19641: done checking to see if all hosts have failed 30529 1726882636.19642: getting the remaining hosts for this loop 30529 1726882636.19644: done getting the remaining hosts for this loop 30529 1726882636.19648: getting the next task for host managed_node1 30529 1726882636.19656: done getting next task for host managed_node1 30529 1726882636.19660: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882636.19664: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.19685: getting variables 30529 1726882636.19686: in VariableManager get_vars() 30529 1726882636.19720: Calling all_inventory to load vars for managed_node1 30529 1726882636.19723: Calling groups_inventory to load vars for managed_node1 30529 1726882636.19725: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.19734: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.19737: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.19739: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.21185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.22465: done with get_vars() 30529 1726882636.22480: done getting variables 30529 1726882636.22547: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:16 -0400 (0:00:00.098) 0:00:50.252 ****** 30529 1726882636.22591: entering _queue_task() for managed_node1/fail 30529 1726882636.22839: worker is 1 (out of 1 available) 30529 1726882636.22853: exiting _queue_task() for managed_node1/fail 30529 1726882636.22866: done queuing things up, now waiting for results queue to drain 30529 1726882636.22867: waiting for pending results... 30529 1726882636.23067: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882636.23162: in run() - task 12673a56-9f93-b0f1-edc0-0000000010fc 30529 1726882636.23172: variable 'ansible_search_path' from source: unknown 30529 1726882636.23176: variable 'ansible_search_path' from source: unknown 30529 1726882636.23209: calling self._execute() 30529 1726882636.23284: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.23288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.23303: variable 'omit' from source: magic vars 30529 1726882636.23580: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.23590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.23694: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.23825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882636.25661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.25816: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.25820: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.25827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.25830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.25874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.25920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.25946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.25977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.25988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.26037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.26058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.26081: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.26120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.26133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.26198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.26201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.26220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.26257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.26296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.26433: variable 'network_connections' from source: include params 30529 1726882636.26528: variable 'interface' from source: play vars 30529 1726882636.26531: variable 'interface' from source: play vars 30529 1726882636.26595: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882636.26816: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882636.26863: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882636.26866: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882636.26898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882636.26973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882636.26977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882636.26983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.27031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882636.27085: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882636.27325: variable 'network_connections' from source: include params 30529 1726882636.27328: variable 'interface' from source: play vars 30529 1726882636.27404: variable 'interface' from source: play vars 30529 1726882636.27438: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882636.27442: when evaluation is False, skipping this task 30529 
1726882636.27445: _execute() done 30529 1726882636.27447: dumping result to json 30529 1726882636.27450: done dumping result, returning 30529 1726882636.27455: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000010fc] 30529 1726882636.27457: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fc skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882636.27717: no more pending results, returning what we have 30529 1726882636.27721: results queue empty 30529 1726882636.27722: checking for any_errors_fatal 30529 1726882636.27733: done checking for any_errors_fatal 30529 1726882636.27733: checking for max_fail_percentage 30529 1726882636.27735: done checking for max_fail_percentage 30529 1726882636.27736: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.27737: done checking to see if all hosts have failed 30529 1726882636.27737: getting the remaining hosts for this loop 30529 1726882636.27739: done getting the remaining hosts for this loop 30529 1726882636.27742: getting the next task for host managed_node1 30529 1726882636.27752: done getting next task for host managed_node1 30529 1726882636.27755: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882636.27760: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.27779: getting variables 30529 1726882636.27780: in VariableManager get_vars() 30529 1726882636.27817: Calling all_inventory to load vars for managed_node1 30529 1726882636.27820: Calling groups_inventory to load vars for managed_node1 30529 1726882636.27822: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.27831: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.27833: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.27835: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.28441: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fc 30529 1726882636.28445: WORKER PROCESS EXITING 30529 1726882636.28717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.30575: done with get_vars() 30529 1726882636.30599: done getting variables 30529 1726882636.30651: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:16 -0400 (0:00:00.080) 0:00:50.332 ****** 30529 1726882636.30685: entering _queue_task() for managed_node1/package 30529 1726882636.31049: worker is 1 (out of 1 available) 30529 1726882636.31063: exiting _queue_task() for managed_node1/package 30529 1726882636.31075: done queuing things up, now waiting for results queue to drain 30529 1726882636.31076: waiting for pending results... 30529 1726882636.31416: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882636.31841: in run() - task 12673a56-9f93-b0f1-edc0-0000000010fd 30529 1726882636.31845: variable 'ansible_search_path' from source: unknown 30529 1726882636.31848: variable 'ansible_search_path' from source: unknown 30529 1726882636.31851: calling self._execute() 30529 1726882636.31853: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.31855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.31857: variable 'omit' from source: magic vars 30529 1726882636.32313: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.32330: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.32699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882636.33043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882636.33102: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882636.33140: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882636.33210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882636.33288: variable 'network_packages' from source: role '' defaults 30529 1726882636.33365: variable '__network_provider_setup' from source: role '' defaults 30529 1726882636.33374: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882636.33421: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882636.33429: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882636.33473: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882636.33587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882636.36029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.36032: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.36035: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.36063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.36095: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.36178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.36216: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.36253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.36300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.36321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.36374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.36401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.36428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.36537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.36541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882636.36773: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882636.36888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.36927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.36957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.37007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.37031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.37198: variable 'ansible_python' from source: facts 30529 1726882636.37202: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882636.37241: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882636.37325: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882636.37473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.37515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.37544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.37597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.37618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.37672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.37774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.37777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.37785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.37813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.37966: variable 'network_connections' from source: include params 
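For context on the `when:` evaluation recorded later in this trace, `Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False`, Jinja2's `subset` test reduces to a plain set-containment check. The sketch below is a minimal re-implementation for illustration only; the variable values are hypothetical and are not taken from this run:

```python
# Minimal re-implementation of the logic behind the Jinja2 "subset"
# test used in the role's "when:" condition: "A is subset(B)" is
# true when every element of A occurs in B.
def is_subset(candidate, container):
    """Return True if every item in candidate is also in container."""
    return set(candidate) <= set(container)

# Hypothetical role/fact data, shaped like network_packages and
# ansible_facts.packages; not the actual values from this run.
network_packages = ["NetworkManager"]
installed = {"NetworkManager": [{"version": "1.48.0"}], "bash": [{"version": "5.2"}]}

# The install task runs only when the packages are NOT already
# present, so a False condition means the task is skipped,
# as seen in the trace.
run_task = not is_subset(network_packages, installed.keys())
```

When every requested package already appears in the gathered package facts, the condition is False and the `package` action is skipped rather than invoking the package manager.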
30529 1726882636.37977: variable 'interface' from source: play vars 30529 1726882636.38113: variable 'interface' from source: play vars 30529 1726882636.38177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882636.38227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882636.38262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.38602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882636.38605: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.38976: variable 'network_connections' from source: include params 30529 1726882636.39044: variable 'interface' from source: play vars 30529 1726882636.39259: variable 'interface' from source: play vars 30529 1726882636.39348: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882636.39545: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.40029: variable 'network_connections' from source: include params 30529 1726882636.40032: variable 'interface' from source: play vars 30529 1726882636.40120: variable 'interface' from source: play vars 30529 1726882636.40146: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882636.40253: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882636.40638: variable 'network_connections' 
from source: include params 30529 1726882636.40641: variable 'interface' from source: play vars 30529 1726882636.40740: variable 'interface' from source: play vars 30529 1726882636.40809: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882636.40917: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882636.40924: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882636.40998: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882636.41357: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882636.42276: variable 'network_connections' from source: include params 30529 1726882636.42300: variable 'interface' from source: play vars 30529 1726882636.42372: variable 'interface' from source: play vars 30529 1726882636.42376: variable 'ansible_distribution' from source: facts 30529 1726882636.42378: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.42381: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.42414: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882636.42605: variable 'ansible_distribution' from source: facts 30529 1726882636.42608: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.42610: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.42613: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882636.42768: variable 'ansible_distribution' from source: facts 30529 1726882636.42776: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.42778: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.42780: variable 'network_provider' from source: set_fact 30529 
1726882636.42782: variable 'ansible_facts' from source: unknown 30529 1726882636.43436: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882636.43439: when evaluation is False, skipping this task 30529 1726882636.43441: _execute() done 30529 1726882636.43443: dumping result to json 30529 1726882636.43447: done dumping result, returning 30529 1726882636.43450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-0000000010fd] 30529 1726882636.43452: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fd
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
30529 1726882636.43781: no more pending results, returning what we have 30529 1726882636.43805: results queue empty 30529 1726882636.43806: checking for any_errors_fatal 30529 1726882636.43812: done checking for any_errors_fatal 30529 1726882636.43813: checking for max_fail_percentage 30529 1726882636.43814: done checking for max_fail_percentage 30529 1726882636.43815: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.43816: done checking to see if all hosts have failed 30529 1726882636.43817: getting the remaining hosts for this loop 30529 1726882636.43818: done getting the remaining hosts for this loop 30529 1726882636.43822: getting the next task for host managed_node1 30529 1726882636.43830: done getting next task for host managed_node1 30529 1726882636.43834: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882636.43839: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.43849: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fd 30529 1726882636.43856: WORKER PROCESS EXITING 30529 1726882636.43869: getting variables 30529 1726882636.43871: in VariableManager get_vars() 30529 1726882636.43926: Calling all_inventory to load vars for managed_node1 30529 1726882636.43941: Calling groups_inventory to load vars for managed_node1 30529 1726882636.43956: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.43970: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.43973: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.43976: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.45303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.46606: done with get_vars() 30529 1726882636.46621: done getting variables 30529 1726882636.46672: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:37:16 -0400 (0:00:00.160) 0:00:50.493 ******

30529 1726882636.46704: entering _queue_task() for managed_node1/package 30529 1726882636.47011: worker is 1 (out of 1 available) 30529 1726882636.47022: exiting _queue_task() for managed_node1/package 30529 1726882636.47034: done queuing things up, now waiting for results queue to drain 30529 1726882636.47035: waiting for pending results... 30529 1726882636.47326: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882636.47501: in run() - task 12673a56-9f93-b0f1-edc0-0000000010fe 30529 1726882636.47506: variable 'ansible_search_path' from source: unknown 30529 1726882636.47509: variable 'ansible_search_path' from source: unknown 30529 1726882636.47532: calling self._execute() 30529 1726882636.47699: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.47703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.47705: variable 'omit' from source: magic vars 30529 1726882636.48007: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.48026: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.48148: variable 'network_state' from source: role '' defaults 30529 1726882636.48159: Evaluated conditional (network_state != {}): False 30529 1726882636.48162: when evaluation
is False, skipping this task 30529 1726882636.48165: _execute() done 30529 1726882636.48168: dumping result to json 30529 1726882636.48170: done dumping result, returning 30529 1726882636.48198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000010fe] 30529 1726882636.48202: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fe 30529 1726882636.48300: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010fe 30529 1726882636.48308: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30529 1726882636.48359: no more pending results, returning what we have 30529 1726882636.48363: results queue empty 30529 1726882636.48364: checking for any_errors_fatal 30529 1726882636.48370: done checking for any_errors_fatal 30529 1726882636.48370: checking for max_fail_percentage 30529 1726882636.48372: done checking for max_fail_percentage 30529 1726882636.48373: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.48373: done checking to see if all hosts have failed 30529 1726882636.48374: getting the remaining hosts for this loop 30529 1726882636.48376: done getting the remaining hosts for this loop 30529 1726882636.48379: getting the next task for host managed_node1 30529 1726882636.48387: done getting next task for host managed_node1 30529 1726882636.48394: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882636.48399: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.48430: getting variables 30529 1726882636.48432: in VariableManager get_vars() 30529 1726882636.48462: Calling all_inventory to load vars for managed_node1 30529 1726882636.48464: Calling groups_inventory to load vars for managed_node1 30529 1726882636.48466: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.48474: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.48476: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.48479: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.49224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.50473: done with get_vars() 30529 1726882636.50492: done getting variables 30529 1726882636.50541: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:37:16 -0400 (0:00:00.038) 0:00:50.531 ******

30529 1726882636.50577: entering _queue_task() for managed_node1/package 30529 1726882636.50849: worker is 1 (out of 1 available) 30529 1726882636.50862: exiting _queue_task() for managed_node1/package 30529 1726882636.50874: done queuing things up, now waiting for results queue to drain 30529 1726882636.50875: waiting for pending results... 30529 1726882636.51311: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882636.51325: in run() - task 12673a56-9f93-b0f1-edc0-0000000010ff 30529 1726882636.51343: variable 'ansible_search_path' from source: unknown 30529 1726882636.51350: variable 'ansible_search_path' from source: unknown 30529 1726882636.51387: calling self._execute() 30529 1726882636.51482: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.51495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.51512: variable 'omit' from source: magic vars 30529 1726882636.51882: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.51896: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.51982: variable 'network_state' from source: role '' defaults 30529 1726882636.51995: Evaluated conditional (network_state != {}): False 30529 1726882636.51998: when evaluation is False, skipping this task 30529 1726882636.52001: _execute() done 30529 1726882636.52004: dumping
result to json 30529 1726882636.52006: done dumping result, returning 30529 1726882636.52010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000010ff] 30529 1726882636.52016: sending task result for task 12673a56-9f93-b0f1-edc0-0000000010ff
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30529 1726882636.52155: no more pending results, returning what we have 30529 1726882636.52159: results queue empty 30529 1726882636.52160: checking for any_errors_fatal 30529 1726882636.52168: done checking for any_errors_fatal 30529 1726882636.52169: checking for max_fail_percentage 30529 1726882636.52170: done checking for max_fail_percentage 30529 1726882636.52171: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.52172: done checking to see if all hosts have failed 30529 1726882636.52172: getting the remaining hosts for this loop 30529 1726882636.52174: done getting the remaining hosts for this loop 30529 1726882636.52178: getting the next task for host managed_node1 30529 1726882636.52186: done getting next task for host managed_node1 30529 1726882636.52190: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882636.52197: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882636.52216: getting variables 30529 1726882636.52218: in VariableManager get_vars() 30529 1726882636.52248: Calling all_inventory to load vars for managed_node1 30529 1726882636.52250: Calling groups_inventory to load vars for managed_node1 30529 1726882636.52252: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.52261: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.52264: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.52266: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.52806: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000010ff 30529 1726882636.52810: WORKER PROCESS EXITING 30529 1726882636.53024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.54162: done with get_vars() 30529 1726882636.54180: done getting variables 30529 1726882636.54238: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:37:16 -0400 (0:00:00.036) 0:00:50.568 ******

30529 1726882636.54272: entering _queue_task() for managed_node1/service 30529 1726882636.54545: worker is 1 (out of 1 available) 30529 1726882636.54558: exiting _queue_task() for managed_node1/service 30529 1726882636.54571: done queuing things up, now waiting for results queue to drain 30529 1726882636.54572: waiting for pending results... 30529 1726882636.54781: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882636.54872: in run() - task 12673a56-9f93-b0f1-edc0-000000001100 30529 1726882636.54883: variable 'ansible_search_path' from source: unknown 30529 1726882636.54887: variable 'ansible_search_path' from source: unknown 30529 1726882636.54922: calling self._execute() 30529 1726882636.54995: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.54999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.55012: variable 'omit' from source: magic vars 30529 1726882636.55269: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.55278: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.55363: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.55491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882636.57071: Loading FilterModule 'core' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.57294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.57298: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.57301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.57304: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.57380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.57413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.57496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.57529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.57560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.57682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.57685: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.57688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.57690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.57705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.57736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.57750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.57771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.57799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.57810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
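The entries that follow evaluate `__network_wireless_connections_defined or __network_team_connections_defined` by scanning `network_connections` before deciding whether to restart NetworkManager. A minimal sketch of that kind of check is below; the connection data and helper function are hypothetical illustrations, not the role's actual implementation:

```python
# Hypothetical sketch: derive "wireless or team connections defined"
# flags from a network_connections-style list, mirroring the kind of
# check the role's "when:" condition performs. Not the role's real code.
def any_of_type(connections, conn_type):
    """True if any connection dict declares the given type."""
    return any(c.get("type") == conn_type for c in connections)

# Example data shaped like the role's network_connections input;
# the interface name is made up for illustration.
network_connections = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]

wireless_defined = any_of_type(network_connections, "wireless")
team_defined = any_of_type(network_connections, "team")

# Matches the skip recorded in this trace: no wireless or team
# connections, so no NetworkManager restart is needed.
restart_needed = wireless_defined or team_defined
```

With only an ethernet connection defined, both flags come out False, which is why the `service` action for the restart task is skipped in the log below.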
30529 1726882636.57977: variable 'network_connections' from source: include params 30529 1726882636.57983: variable 'interface' from source: play vars 30529 1726882636.58109: variable 'interface' from source: play vars 30529 1726882636.58167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882636.58376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882636.58781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882636.58861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882636.58866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882636.58958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882636.58961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882636.59145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.59148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882636.59151: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882636.59500: variable 'network_connections' from source: include params 30529 1726882636.59503: variable 'interface' from source: play vars 30529 1726882636.59509: variable 
'interface' from source: play vars 30529 1726882636.59512: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882636.59514: when evaluation is False, skipping this task 30529 1726882636.59556: _execute() done 30529 1726882636.59573: dumping result to json 30529 1726882636.59577: done dumping result, returning 30529 1726882636.59579: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001100] 30529 1726882636.59581: sending task result for task 12673a56-9f93-b0f1-edc0-000000001100 30529 1726882636.59656: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001100 30529 1726882636.59665: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882636.59749: no more pending results, returning what we have 30529 1726882636.59752: results queue empty 30529 1726882636.59753: checking for any_errors_fatal 30529 1726882636.59760: done checking for any_errors_fatal 30529 1726882636.59760: checking for max_fail_percentage 30529 1726882636.59762: done checking for max_fail_percentage 30529 1726882636.59763: checking to see if all hosts have failed and the running result is not ok 30529 1726882636.59764: done checking to see if all hosts have failed 30529 1726882636.59764: getting the remaining hosts for this loop 30529 1726882636.59766: done getting the remaining hosts for this loop 30529 1726882636.59769: getting the next task for host managed_node1 30529 1726882636.59777: done getting next task for host managed_node1 30529 1726882636.59780: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882636.59785: ^ state is: HOST STATE: block=6, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882636.59807: getting variables 30529 1726882636.59809: in VariableManager get_vars() 30529 1726882636.59844: Calling all_inventory to load vars for managed_node1 30529 1726882636.59847: Calling groups_inventory to load vars for managed_node1 30529 1726882636.59849: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882636.59858: Calling all_plugins_play to load vars for managed_node1 30529 1726882636.59860: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882636.59862: Calling groups_plugins_play to load vars for managed_node1 30529 1726882636.61205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882636.62248: done with get_vars() 30529 1726882636.62263: done getting variables 30529 1726882636.62309: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:16 -0400 (0:00:00.080) 0:00:50.649 ****** 30529 1726882636.62331: entering _queue_task() for managed_node1/service 30529 1726882636.62529: worker is 1 (out of 1 available) 30529 1726882636.62538: exiting _queue_task() for managed_node1/service 30529 1726882636.62548: done queuing things up, now waiting for results queue to drain 30529 1726882636.62549: waiting for pending results... 
30529 1726882636.62790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882636.62880: in run() - task 12673a56-9f93-b0f1-edc0-000000001101 30529 1726882636.62889: variable 'ansible_search_path' from source: unknown 30529 1726882636.62897: variable 'ansible_search_path' from source: unknown 30529 1726882636.62924: calling self._execute() 30529 1726882636.62996: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.63000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.63010: variable 'omit' from source: magic vars 30529 1726882636.63274: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.63284: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882636.63412: variable 'network_provider' from source: set_fact 30529 1726882636.63415: variable 'network_state' from source: role '' defaults 30529 1726882636.63425: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882636.63431: variable 'omit' from source: magic vars 30529 1726882636.63473: variable 'omit' from source: magic vars 30529 1726882636.63495: variable 'network_service_name' from source: role '' defaults 30529 1726882636.63539: variable 'network_service_name' from source: role '' defaults 30529 1726882636.63611: variable '__network_provider_setup' from source: role '' defaults 30529 1726882636.63615: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882636.63662: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882636.63669: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882636.63715: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882636.63998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 30529 1726882636.66148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882636.66243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882636.66291: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882636.66330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882636.66354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882636.66432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.66455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.66497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.66529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.66685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.66691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30529 1726882636.66695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.66697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.66725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.66731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.66969: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882636.67055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.67062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.67080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.67109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.67177: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.67217: variable 'ansible_python' from source: facts 30529 1726882636.67239: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882636.67296: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882636.67598: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882636.67601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.67604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.67607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.67649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.67667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.67727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882636.67763: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882636.67796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.67846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882636.67865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882636.68044: variable 'network_connections' from source: include params 30529 1726882636.68065: variable 'interface' from source: play vars 30529 1726882636.68164: variable 'interface' from source: play vars 30529 1726882636.68248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882636.68459: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882636.68650: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882636.68847: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882636.68898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882636.69048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882636.69051: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882636.69053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882636.69105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882636.69231: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.69560: variable 'network_connections' from source: include params 30529 1726882636.69571: variable 'interface' from source: play vars 30529 1726882636.69692: variable 'interface' from source: play vars 30529 1726882636.69757: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882636.69855: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882636.70378: variable 'network_connections' from source: include params 30529 1726882636.70381: variable 'interface' from source: play vars 30529 1726882636.70598: variable 'interface' from source: play vars 30529 1726882636.70624: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882636.70714: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882636.71153: variable 'network_connections' from source: include params 30529 1726882636.71163: variable 'interface' from source: play vars 30529 1726882636.71372: variable 'interface' from source: play vars 30529 1726882636.71577: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882636.71806: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 
1726882636.71809: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882636.71811: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882636.72330: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882636.72821: variable 'network_connections' from source: include params 30529 1726882636.72832: variable 'interface' from source: play vars 30529 1726882636.72882: variable 'interface' from source: play vars 30529 1726882636.72890: variable 'ansible_distribution' from source: facts 30529 1726882636.72900: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.72902: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.72935: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882636.73046: variable 'ansible_distribution' from source: facts 30529 1726882636.73049: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.73054: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.73062: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882636.73174: variable 'ansible_distribution' from source: facts 30529 1726882636.73178: variable '__network_rh_distros' from source: role '' defaults 30529 1726882636.73181: variable 'ansible_distribution_major_version' from source: facts 30529 1726882636.73213: variable 'network_provider' from source: set_fact 30529 1726882636.73233: variable 'omit' from source: magic vars 30529 1726882636.73254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882636.73274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882636.73289: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882636.73306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882636.73314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882636.73339: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882636.73342: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.73344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.73417: Set connection var ansible_shell_executable to /bin/sh 30529 1726882636.73420: Set connection var ansible_pipelining to False 30529 1726882636.73423: Set connection var ansible_shell_type to sh 30529 1726882636.73431: Set connection var ansible_timeout to 10 30529 1726882636.73435: Set connection var ansible_connection to ssh 30529 1726882636.73437: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882636.73460: variable 'ansible_shell_executable' from source: unknown 30529 1726882636.73463: variable 'ansible_connection' from source: unknown 30529 1726882636.73466: variable 'ansible_module_compression' from source: unknown 30529 1726882636.73468: variable 'ansible_shell_type' from source: unknown 30529 1726882636.73470: variable 'ansible_shell_executable' from source: unknown 30529 1726882636.73483: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882636.73487: variable 'ansible_pipelining' from source: unknown 30529 1726882636.73489: variable 'ansible_timeout' from source: unknown 30529 1726882636.73491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882636.73573: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882636.73583: variable 'omit' from source: magic vars 30529 1726882636.73588: starting attempt loop 30529 1726882636.73591: running the handler 30529 1726882636.73648: variable 'ansible_facts' from source: unknown 30529 1726882636.74351: _low_level_execute_command(): starting 30529 1726882636.74355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882636.75241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882636.75253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882636.75273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.75277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882636.75347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882636.75351: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882636.75439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.75448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882636.75452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882636.75645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882636.75682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882636.77356: stdout chunk (state=3): >>>/root <<< 30529 1726882636.77456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882636.77498: stderr chunk (state=3): >>><<< 30529 1726882636.77502: stdout chunk (state=3): >>><<< 30529 1726882636.77505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882636.77508: _low_level_execute_command(): starting 30529 1726882636.77517: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951 `" && echo ansible-tmp-1726882636.7749834-33003-16937647128951="` echo /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951 `" ) && sleep 0' 30529 1726882636.78088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882636.78101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882636.78105: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.78108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.78110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.78142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882636.78215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882636.80038: stdout chunk (state=3): >>>ansible-tmp-1726882636.7749834-33003-16937647128951=/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951 <<< 30529 1726882636.80201: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882636.80205: stdout chunk (state=3): >>><<< 30529 1726882636.80207: stderr chunk (state=3): >>><<< 30529 1726882636.80226: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882636.7749834-33003-16937647128951=/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882636.80398: variable 'ansible_module_compression' from source: unknown 30529 1726882636.80401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882636.80404: variable 'ansible_facts' from source: unknown 30529 1726882636.80651: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py 
30529 1726882636.81023: Sending initial data 30529 1726882636.81026: Sent initial data (155 bytes) 30529 1726882636.81874: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882636.81882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882636.81897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.82056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882636.82110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.82306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882636.82444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882636.82508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882636.84010: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882636.84022: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882636.84027: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882636.84073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882636.84118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpq9rgae9v /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py <<< 30529 1726882636.84121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py" <<< 30529 1726882636.84162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpq9rgae9v" to remote "/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py" <<< 30529 1726882636.86029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882636.86060: stderr chunk (state=3): >>><<< 30529 1726882636.86063: stdout chunk (state=3): >>><<< 30529 1726882636.86082: done transferring module to remote 30529 1726882636.86094: _low_level_execute_command(): starting 30529 1726882636.86097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/ 
/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py && sleep 0' 30529 1726882636.86507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.86512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.86514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882636.86516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.86518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.86565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882636.86569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882636.86616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882636.88404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882636.88407: stdout chunk (state=3): >>><<< 30529 1726882636.88410: stderr chunk (state=3): >>><<< 30529 1726882636.88412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882636.88415: _low_level_execute_command(): starting 30529 1726882636.88417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/AnsiballZ_systemd.py && sleep 0' 30529 1726882636.88934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882636.88949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882636.88959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882636.89009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882636.89012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882636.89014: stderr 
chunk (state=3): >>>debug2: match not found <<< 30529 1726882636.89016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.89049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882636.89119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882636.89131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882636.89208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.17691: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": 
"restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10842112", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301658624", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1788839000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", 
"IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882637.17727: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", 
"LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": 
"private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 
2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882637.19799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882637.19803: stdout chunk (state=3): >>><<< 30529 1726882637.19805: stderr chunk (state=3): >>><<< 30529 1726882637.19808: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10842112", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301658624", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1788839000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882637.20213: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882637.20216: _low_level_execute_command(): starting 30529 1726882637.20219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882636.7749834-33003-16937647128951/ > /dev/null 2>&1 && sleep 0' 30529 1726882637.21399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882637.21403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.21525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.21556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.21611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.23434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882637.23592: stderr chunk (state=3): >>><<< 30529 1726882637.23599: stdout chunk (state=3): >>><<< 30529 1726882637.23602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882637.23604: handler run complete 30529 1726882637.23634: 
attempt loop complete, returning result 30529 1726882637.23638: _execute() done 30529 1726882637.23640: dumping result to json 30529 1726882637.23669: done dumping result, returning 30529 1726882637.23672: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000001101] 30529 1726882637.23674: sending task result for task 12673a56-9f93-b0f1-edc0-000000001101 30529 1726882637.24201: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001101 30529 1726882637.24310: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882637.24363: no more pending results, returning what we have 30529 1726882637.24366: results queue empty 30529 1726882637.24367: checking for any_errors_fatal 30529 1726882637.24373: done checking for any_errors_fatal 30529 1726882637.24373: checking for max_fail_percentage 30529 1726882637.24375: done checking for max_fail_percentage 30529 1726882637.24376: checking to see if all hosts have failed and the running result is not ok 30529 1726882637.24377: done checking to see if all hosts have failed 30529 1726882637.24378: getting the remaining hosts for this loop 30529 1726882637.24379: done getting the remaining hosts for this loop 30529 1726882637.24383: getting the next task for host managed_node1 30529 1726882637.24409: done getting next task for host managed_node1 30529 1726882637.24413: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882637.24424: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882637.24438: getting variables 30529 1726882637.24440: in VariableManager get_vars() 30529 1726882637.24473: Calling all_inventory to load vars for managed_node1 30529 1726882637.24475: Calling groups_inventory to load vars for managed_node1 30529 1726882637.24477: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882637.24487: Calling all_plugins_play to load vars for managed_node1 30529 1726882637.24490: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882637.24495: Calling groups_plugins_play to load vars for managed_node1 30529 1726882637.25917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882637.27359: done with get_vars() 30529 1726882637.27375: done getting variables 30529 1726882637.27420: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:17 -0400 (0:00:00.651) 0:00:51.300 ****** 30529 1726882637.27448: entering _queue_task() for managed_node1/service 30529 1726882637.27695: worker is 1 (out of 1 available) 30529 1726882637.27710: exiting _queue_task() for managed_node1/service 30529 1726882637.27723: done queuing things up, now waiting for results queue to drain 30529 1726882637.27724: waiting for pending results... 30529 1726882637.27918: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882637.28017: in run() - task 12673a56-9f93-b0f1-edc0-000000001102 30529 1726882637.28029: variable 'ansible_search_path' from source: unknown 30529 1726882637.28033: variable 'ansible_search_path' from source: unknown 30529 1726882637.28063: calling self._execute() 30529 1726882637.28134: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.28137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.28145: variable 'omit' from source: magic vars 30529 1726882637.28423: variable 'ansible_distribution_major_version' from source: facts 30529 1726882637.28432: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882637.28515: variable 'network_provider' from source: set_fact 30529 1726882637.28519: Evaluated conditional (network_provider == "nm"): True 30529 1726882637.28581: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882637.28650: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30529 1726882637.28777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882637.30740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882637.30783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882637.30814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882637.30841: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882637.30861: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882637.30922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882637.30947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882637.30964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882637.30990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882637.31009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882637.31043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882637.31060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882637.31077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882637.31107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882637.31118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882637.31147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882637.31165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882637.31182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882637.31211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882637.31222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882637.31323: variable 'network_connections' from source: include params 30529 1726882637.31330: variable 'interface' from source: play vars 30529 1726882637.31376: variable 'interface' from source: play vars 30529 1726882637.31438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882637.31547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882637.31573: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882637.31601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882637.31624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882637.31653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882637.31668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882637.31684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882637.31714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882637.31758: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30529 1726882637.31977: variable 'network_connections' from source: include params 30529 1726882637.32198: variable 'interface' from source: play vars 30529 1726882637.32201: variable 'interface' from source: play vars 30529 1726882637.32203: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882637.32205: when evaluation is False, skipping this task 30529 1726882637.32207: _execute() done 30529 1726882637.32209: dumping result to json 30529 1726882637.32211: done dumping result, returning 30529 1726882637.32213: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000001102] 30529 1726882637.32226: sending task result for task 12673a56-9f93-b0f1-edc0-000000001102 30529 1726882637.32286: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001102 30529 1726882637.32289: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882637.32371: no more pending results, returning what we have 30529 1726882637.32374: results queue empty 30529 1726882637.32375: checking for any_errors_fatal 30529 1726882637.32395: done checking for any_errors_fatal 30529 1726882637.32396: checking for max_fail_percentage 30529 1726882637.32398: done checking for max_fail_percentage 30529 1726882637.32399: checking to see if all hosts have failed and the running result is not ok 30529 1726882637.32399: done checking to see if all hosts have failed 30529 1726882637.32400: getting the remaining hosts for this loop 30529 1726882637.32402: done getting the remaining hosts for this loop 30529 1726882637.32405: getting the next task for host managed_node1 30529 1726882637.32412: done getting next task for host managed_node1 30529 1726882637.32415: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30529 1726882637.32433: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882637.32450: getting variables 30529 1726882637.32452: in VariableManager get_vars() 30529 1726882637.32483: Calling all_inventory to load vars for managed_node1 30529 1726882637.32485: Calling groups_inventory to load vars for managed_node1 30529 1726882637.32487: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882637.32498: Calling all_plugins_play to load vars for managed_node1 30529 1726882637.32501: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882637.32504: Calling groups_plugins_play to load vars for managed_node1 30529 1726882637.33711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882637.34574: done with get_vars() 30529 1726882637.34591: done getting variables 30529 1726882637.34633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:17 -0400 (0:00:00.072) 0:00:51.372 ****** 30529 1726882637.34655: entering _queue_task() for managed_node1/service 30529 1726882637.34912: worker is 1 (out of 1 available) 30529 1726882637.34927: exiting _queue_task() for managed_node1/service 30529 1726882637.34940: done queuing things up, now waiting for results queue to drain 30529 1726882637.34942: waiting for pending results... 
30529 1726882637.35243: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882637.35394: in run() - task 12673a56-9f93-b0f1-edc0-000000001103 30529 1726882637.35422: variable 'ansible_search_path' from source: unknown 30529 1726882637.35433: variable 'ansible_search_path' from source: unknown 30529 1726882637.35482: calling self._execute() 30529 1726882637.35575: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.35587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.35620: variable 'omit' from source: magic vars 30529 1726882637.35998: variable 'ansible_distribution_major_version' from source: facts 30529 1726882637.36005: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882637.36085: variable 'network_provider' from source: set_fact 30529 1726882637.36102: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882637.36110: when evaluation is False, skipping this task 30529 1726882637.36117: _execute() done 30529 1726882637.36125: dumping result to json 30529 1726882637.36133: done dumping result, returning 30529 1726882637.36144: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000001103] 30529 1726882637.36153: sending task result for task 12673a56-9f93-b0f1-edc0-000000001103 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882637.36297: no more pending results, returning what we have 30529 1726882637.36301: results queue empty 30529 1726882637.36302: checking for any_errors_fatal 30529 1726882637.36314: done checking for any_errors_fatal 30529 1726882637.36314: checking for max_fail_percentage 30529 1726882637.36316: done checking for max_fail_percentage 30529 
1726882637.36317: checking to see if all hosts have failed and the running result is not ok 30529 1726882637.36318: done checking to see if all hosts have failed 30529 1726882637.36319: getting the remaining hosts for this loop 30529 1726882637.36320: done getting the remaining hosts for this loop 30529 1726882637.36324: getting the next task for host managed_node1 30529 1726882637.36333: done getting next task for host managed_node1 30529 1726882637.36338: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882637.36343: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882637.36366: getting variables 30529 1726882637.36368: in VariableManager get_vars() 30529 1726882637.36408: Calling all_inventory to load vars for managed_node1 30529 1726882637.36411: Calling groups_inventory to load vars for managed_node1 30529 1726882637.36413: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882637.36426: Calling all_plugins_play to load vars for managed_node1 30529 1726882637.36429: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882637.36433: Calling groups_plugins_play to load vars for managed_node1 30529 1726882637.37435: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001103 30529 1726882637.37439: WORKER PROCESS EXITING 30529 1726882637.37451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882637.38416: done with get_vars() 30529 1726882637.38435: done getting variables 30529 1726882637.38492: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:17 -0400 (0:00:00.038) 0:00:51.411 ****** 30529 1726882637.38525: entering _queue_task() for managed_node1/copy 30529 1726882637.38830: worker is 1 (out of 1 available) 30529 1726882637.38842: exiting _queue_task() for managed_node1/copy 30529 1726882637.38853: done queuing things up, now waiting for results queue to drain 30529 1726882637.38855: waiting for pending results... 
30529 1726882637.39209: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882637.39215: in run() - task 12673a56-9f93-b0f1-edc0-000000001104 30529 1726882637.39219: variable 'ansible_search_path' from source: unknown 30529 1726882637.39222: variable 'ansible_search_path' from source: unknown 30529 1726882637.39257: calling self._execute() 30529 1726882637.39356: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.39369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.39382: variable 'omit' from source: magic vars 30529 1726882637.39678: variable 'ansible_distribution_major_version' from source: facts 30529 1726882637.39697: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882637.39769: variable 'network_provider' from source: set_fact 30529 1726882637.39773: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882637.39776: when evaluation is False, skipping this task 30529 1726882637.39779: _execute() done 30529 1726882637.39781: dumping result to json 30529 1726882637.39784: done dumping result, returning 30529 1726882637.39797: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000001104] 30529 1726882637.39800: sending task result for task 12673a56-9f93-b0f1-edc0-000000001104 30529 1726882637.39885: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001104 30529 1726882637.39888: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882637.39955: no more pending results, returning what we have 30529 1726882637.39958: results queue empty 30529 1726882637.39959: checking for 
any_errors_fatal 30529 1726882637.39963: done checking for any_errors_fatal 30529 1726882637.39964: checking for max_fail_percentage 30529 1726882637.39965: done checking for max_fail_percentage 30529 1726882637.39966: checking to see if all hosts have failed and the running result is not ok 30529 1726882637.39967: done checking to see if all hosts have failed 30529 1726882637.39967: getting the remaining hosts for this loop 30529 1726882637.39969: done getting the remaining hosts for this loop 30529 1726882637.39972: getting the next task for host managed_node1 30529 1726882637.39978: done getting next task for host managed_node1 30529 1726882637.39981: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882637.39986: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882637.40007: getting variables 30529 1726882637.40009: in VariableManager get_vars() 30529 1726882637.40040: Calling all_inventory to load vars for managed_node1 30529 1726882637.40042: Calling groups_inventory to load vars for managed_node1 30529 1726882637.40044: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882637.40052: Calling all_plugins_play to load vars for managed_node1 30529 1726882637.40055: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882637.40058: Calling groups_plugins_play to load vars for managed_node1 30529 1726882637.40785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882637.41652: done with get_vars() 30529 1726882637.41666: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:17 -0400 (0:00:00.032) 0:00:51.443 ****** 30529 1726882637.41727: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882637.41922: worker is 1 (out of 1 available) 30529 1726882637.41937: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882637.41950: done queuing things up, now waiting for results queue to drain 30529 1726882637.41951: waiting for pending results... 
30529 1726882637.42118: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882637.42297: in run() - task 12673a56-9f93-b0f1-edc0-000000001105 30529 1726882637.42301: variable 'ansible_search_path' from source: unknown 30529 1726882637.42304: variable 'ansible_search_path' from source: unknown 30529 1726882637.42342: calling self._execute() 30529 1726882637.42472: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.42485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.42504: variable 'omit' from source: magic vars 30529 1726882637.42757: variable 'ansible_distribution_major_version' from source: facts 30529 1726882637.42769: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882637.42772: variable 'omit' from source: magic vars 30529 1726882637.42815: variable 'omit' from source: magic vars 30529 1726882637.42922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882637.45098: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882637.45101: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882637.45113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882637.45151: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882637.45182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882637.45262: variable 'network_provider' from source: set_fact 30529 1726882637.45401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882637.45433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882637.45459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882637.45505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882637.45523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882637.45597: variable 'omit' from source: magic vars 30529 1726882637.45701: variable 'omit' from source: magic vars 30529 1726882637.45804: variable 'network_connections' from source: include params 30529 1726882637.45820: variable 'interface' from source: play vars 30529 1726882637.45884: variable 'interface' from source: play vars 30529 1726882637.46051: variable 'omit' from source: magic vars 30529 1726882637.46063: variable '__lsr_ansible_managed' from source: task vars 30529 1726882637.46128: variable '__lsr_ansible_managed' from source: task vars 30529 1726882637.46698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882637.46834: Loaded config def from plugin (lookup/template) 30529 1726882637.46844: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882637.46872: File lookup term: get_ansible_managed.j2 30529 1726882637.46879: variable 
'ansible_search_path' from source: unknown 30529 1726882637.46888: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882637.46911: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882637.46934: variable 'ansible_search_path' from source: unknown 30529 1726882637.51189: variable 'ansible_managed' from source: unknown 30529 1726882637.51338: variable 'omit' from source: magic vars 30529 1726882637.51371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882637.51498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882637.51502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882637.51505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882637.51508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882637.51511: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882637.51514: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.51517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.51608: Set connection var ansible_shell_executable to /bin/sh 30529 1726882637.51621: Set connection var ansible_pipelining to False 30529 1726882637.51627: Set connection var ansible_shell_type to sh 30529 1726882637.51643: Set connection var ansible_timeout to 10 30529 1726882637.51649: Set connection var ansible_connection to ssh 30529 1726882637.51657: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882637.51683: variable 'ansible_shell_executable' from source: unknown 30529 1726882637.51695: variable 'ansible_connection' from source: unknown 30529 1726882637.51703: variable 'ansible_module_compression' from source: unknown 30529 1726882637.51709: variable 'ansible_shell_type' from source: unknown 30529 1726882637.51715: variable 'ansible_shell_executable' from source: unknown 30529 1726882637.51722: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882637.51729: variable 'ansible_pipelining' from source: unknown 30529 1726882637.51735: variable 'ansible_timeout' from source: unknown 30529 1726882637.51742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882637.51875: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882637.52047: variable 'omit' from 
source: magic vars 30529 1726882637.52051: starting attempt loop 30529 1726882637.52053: running the handler 30529 1726882637.52056: _low_level_execute_command(): starting 30529 1726882637.52058: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882637.52752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.52815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882637.52858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.52890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.52949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.54616: stdout chunk (state=3): >>>/root <<< 30529 1726882637.54746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882637.54749: stdout chunk (state=3): >>><<< 30529 1726882637.54751: stderr chunk (state=3): >>><<< 30529 1726882637.54767: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882637.54792: _low_level_execute_command(): starting 30529 1726882637.54868: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660 `" && echo ansible-tmp-1726882637.5477214-33052-71311157080660="` echo /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660 `" ) && sleep 0' 30529 1726882637.55545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882637.55548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882637.55551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882637.55553: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882637.55561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882637.55569: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882637.55579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.55596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882637.55653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882637.55656: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882637.55659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882637.55661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882637.55663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882637.55665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.55726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.55740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.55809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.57646: stdout chunk (state=3): >>>ansible-tmp-1726882637.5477214-33052-71311157080660=/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660 <<< 30529 1726882637.57819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882637.57822: stdout chunk 
(state=3): >>><<< 30529 1726882637.57824: stderr chunk (state=3): >>><<< 30529 1726882637.57843: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882637.5477214-33052-71311157080660=/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882637.57998: variable 'ansible_module_compression' from source: unknown 30529 1726882637.58001: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882637.58003: variable 'ansible_facts' from source: unknown 30529 1726882637.58153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py 30529 1726882637.58360: Sending 
initial data 30529 1726882637.58363: Sent initial data (167 bytes) 30529 1726882637.58962: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882637.59008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.59024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882637.59118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882637.59135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.59154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.59327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.60920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882637.60979: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882637.61032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpsrp2uisb /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py <<< 30529 1726882637.61035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py" <<< 30529 1726882637.61068: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpsrp2uisb" to remote "/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py" <<< 30529 1726882637.63136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882637.63140: stdout chunk (state=3): >>><<< 30529 1726882637.63142: stderr chunk (state=3): >>><<< 30529 1726882637.63144: done transferring module to remote 30529 1726882637.63146: _low_level_execute_command(): starting 30529 1726882637.63149: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/ /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py && sleep 0' 30529 1726882637.64472: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882637.64594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.64758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.64801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.64944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.66616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882637.66883: stderr chunk (state=3): >>><<< 30529 1726882637.66886: stdout chunk (state=3): >>><<< 30529 1726882637.66892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882637.66897: _low_level_execute_command(): starting 30529 1726882637.66899: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/AnsiballZ_network_connections.py && sleep 0' 30529 1726882637.67803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882637.67816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882637.67832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882637.67857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882637.67880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882637.67968: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.68010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882637.68028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.68119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.68211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882637.95407: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882637.98202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882637.98206: stdout chunk (state=3): >>><<< 30529 1726882637.98208: stderr chunk (state=3): >>><<< 30529 1726882637.98210: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882637.98212: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882637.98214: _low_level_execute_command(): starting 30529 1726882637.98216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882637.5477214-33052-71311157080660/ > /dev/null 2>&1 && sleep 0' 30529 1726882637.99037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882637.99098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882637.99124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882637.99161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882637.99202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.01266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.01269: stdout chunk (state=3): >>><<< 30529 1726882638.01272: stderr chunk (state=3): >>><<< 30529 1726882638.01274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882638.01276: handler run complete 30529 1726882638.01303: attempt loop complete, returning result 30529 1726882638.01306: _execute() done 30529 1726882638.01309: dumping result to json 30529 1726882638.01369: done dumping result, returning 30529 1726882638.01372: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000001105] 30529 1726882638.01379: sending task result for task 12673a56-9f93-b0f1-edc0-000000001105 30529 1726882638.01630: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001105 30529 1726882638.01634: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 30529 1726882638.01735: no more pending results, returning what we have 30529 1726882638.01738: results queue empty 30529 1726882638.01739: checking for any_errors_fatal 30529 1726882638.01746: done checking for any_errors_fatal 30529 1726882638.01747: checking for max_fail_percentage 30529 1726882638.01748: done checking for max_fail_percentage 30529 1726882638.01749: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.01750: done 
checking to see if all hosts have failed 30529 1726882638.01751: getting the remaining hosts for this loop 30529 1726882638.01752: done getting the remaining hosts for this loop 30529 1726882638.01755: getting the next task for host managed_node1 30529 1726882638.01763: done getting next task for host managed_node1 30529 1726882638.01766: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882638.01771: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.01782: getting variables 30529 1726882638.01783: in VariableManager get_vars() 30529 1726882638.01966: Calling all_inventory to load vars for managed_node1 30529 1726882638.01969: Calling groups_inventory to load vars for managed_node1 30529 1726882638.01971: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.01980: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.01983: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.01985: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.03720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.05961: done with get_vars() 30529 1726882638.05984: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:18 -0400 (0:00:00.643) 0:00:52.087 ****** 30529 1726882638.06096: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882638.06614: worker is 1 (out of 1 available) 30529 1726882638.06629: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882638.06644: done queuing things up, now waiting for results queue to drain 30529 1726882638.06645: waiting for pending results... 
30529 1726882638.06916: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882638.07081: in run() - task 12673a56-9f93-b0f1-edc0-000000001106 30529 1726882638.07109: variable 'ansible_search_path' from source: unknown 30529 1726882638.07118: variable 'ansible_search_path' from source: unknown 30529 1726882638.07205: calling self._execute() 30529 1726882638.07250: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.07254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.07268: variable 'omit' from source: magic vars 30529 1726882638.07550: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.07559: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.07649: variable 'network_state' from source: role '' defaults 30529 1726882638.07656: Evaluated conditional (network_state != {}): False 30529 1726882638.07659: when evaluation is False, skipping this task 30529 1726882638.07661: _execute() done 30529 1726882638.07664: dumping result to json 30529 1726882638.07667: done dumping result, returning 30529 1726882638.07674: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000001106] 30529 1726882638.07677: sending task result for task 12673a56-9f93-b0f1-edc0-000000001106 30529 1726882638.07766: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001106 30529 1726882638.07769: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882638.07819: no more pending results, returning what we have 30529 1726882638.07823: results queue empty 30529 1726882638.07824: checking for any_errors_fatal 30529 1726882638.07834: done checking for any_errors_fatal 
30529 1726882638.07834: checking for max_fail_percentage 30529 1726882638.07836: done checking for max_fail_percentage 30529 1726882638.07837: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.07838: done checking to see if all hosts have failed 30529 1726882638.07839: getting the remaining hosts for this loop 30529 1726882638.07840: done getting the remaining hosts for this loop 30529 1726882638.07844: getting the next task for host managed_node1 30529 1726882638.07852: done getting next task for host managed_node1 30529 1726882638.07856: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882638.07890: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.07912: getting variables 30529 1726882638.07914: in VariableManager get_vars() 30529 1726882638.07966: Calling all_inventory to load vars for managed_node1 30529 1726882638.07968: Calling groups_inventory to load vars for managed_node1 30529 1726882638.07970: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.07979: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.08009: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.08014: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.09555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.11311: done with get_vars() 30529 1726882638.11325: done getting variables 30529 1726882638.11372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:18 -0400 (0:00:00.053) 0:00:52.140 ****** 30529 1726882638.11401: entering _queue_task() for managed_node1/debug 30529 1726882638.11612: worker is 1 (out of 1 available) 30529 1726882638.11626: exiting _queue_task() for managed_node1/debug 30529 1726882638.11639: done queuing things up, now waiting for results queue to drain 30529 1726882638.11640: waiting for pending results... 
30529 1726882638.11823: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882638.11907: in run() - task 12673a56-9f93-b0f1-edc0-000000001107 30529 1726882638.11918: variable 'ansible_search_path' from source: unknown 30529 1726882638.11922: variable 'ansible_search_path' from source: unknown 30529 1726882638.11955: calling self._execute() 30529 1726882638.12030: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.12040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.12049: variable 'omit' from source: magic vars 30529 1726882638.12420: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.12423: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.12426: variable 'omit' from source: magic vars 30529 1726882638.12429: variable 'omit' from source: magic vars 30529 1726882638.12601: variable 'omit' from source: magic vars 30529 1726882638.12605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882638.12608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882638.12611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882638.12613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.12615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.12617: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882638.12619: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.12621: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882638.12714: Set connection var ansible_shell_executable to /bin/sh 30529 1726882638.12718: Set connection var ansible_pipelining to False 30529 1726882638.12721: Set connection var ansible_shell_type to sh 30529 1726882638.12737: Set connection var ansible_timeout to 10 30529 1726882638.12740: Set connection var ansible_connection to ssh 30529 1726882638.12742: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882638.12759: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.12762: variable 'ansible_connection' from source: unknown 30529 1726882638.12765: variable 'ansible_module_compression' from source: unknown 30529 1726882638.12767: variable 'ansible_shell_type' from source: unknown 30529 1726882638.12770: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.12772: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.12777: variable 'ansible_pipelining' from source: unknown 30529 1726882638.12779: variable 'ansible_timeout' from source: unknown 30529 1726882638.12818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.13035: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882638.13039: variable 'omit' from source: magic vars 30529 1726882638.13042: starting attempt loop 30529 1726882638.13044: running the handler 30529 1726882638.13253: variable '__network_connections_result' from source: set_fact 30529 1726882638.13449: handler run complete 30529 1726882638.13519: attempt loop complete, returning result 30529 1726882638.13532: _execute() done 30529 1726882638.13538: dumping result to json 30529 1726882638.13545: 
done dumping result, returning 30529 1726882638.13557: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001107] 30529 1726882638.13566: sending task result for task 12673a56-9f93-b0f1-edc0-000000001107 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239" ] } 30529 1726882638.13952: no more pending results, returning what we have 30529 1726882638.13956: results queue empty 30529 1726882638.13957: checking for any_errors_fatal 30529 1726882638.13962: done checking for any_errors_fatal 30529 1726882638.13963: checking for max_fail_percentage 30529 1726882638.13965: done checking for max_fail_percentage 30529 1726882638.13965: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.13966: done checking to see if all hosts have failed 30529 1726882638.13967: getting the remaining hosts for this loop 30529 1726882638.13969: done getting the remaining hosts for this loop 30529 1726882638.13972: getting the next task for host managed_node1 30529 1726882638.13980: done getting next task for host managed_node1 30529 1726882638.13987: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882638.13995: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.14011: getting variables 30529 1726882638.14014: in VariableManager get_vars() 30529 1726882638.14052: Calling all_inventory to load vars for managed_node1 30529 1726882638.14054: Calling groups_inventory to load vars for managed_node1 30529 1726882638.14057: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.14065: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.14068: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.14070: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.14976: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001107 30529 1726882638.14980: WORKER PROCESS EXITING 30529 1726882638.14992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.16075: done with get_vars() 30529 1726882638.16100: done getting variables 30529 1726882638.16166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:18 -0400 (0:00:00.047) 0:00:52.188 ****** 30529 1726882638.16197: entering _queue_task() for managed_node1/debug 30529 1726882638.16521: worker is 1 (out of 1 available) 30529 1726882638.16533: exiting _queue_task() for managed_node1/debug 30529 1726882638.16546: done queuing things up, now waiting for results queue to drain 30529 1726882638.16548: waiting for pending results... 30529 1726882638.16746: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882638.17059: in run() - task 12673a56-9f93-b0f1-edc0-000000001108 30529 1726882638.17064: variable 'ansible_search_path' from source: unknown 30529 1726882638.17067: variable 'ansible_search_path' from source: unknown 30529 1726882638.17070: calling self._execute() 30529 1726882638.17074: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.17088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.17108: variable 'omit' from source: magic vars 30529 1726882638.17580: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.17583: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.17586: variable 'omit' from source: magic vars 30529 1726882638.17671: variable 'omit' from source: magic vars 30529 1726882638.17716: variable 'omit' from source: magic vars 30529 1726882638.17799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882638.17828: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882638.17861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882638.17918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.17923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.17950: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882638.17958: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.17964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.18161: Set connection var ansible_shell_executable to /bin/sh 30529 1726882638.18191: Set connection var ansible_pipelining to False 30529 1726882638.18196: Set connection var ansible_shell_type to sh 30529 1726882638.18201: Set connection var ansible_timeout to 10 30529 1726882638.18203: Set connection var ansible_connection to ssh 30529 1726882638.18205: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882638.18207: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.18209: variable 'ansible_connection' from source: unknown 30529 1726882638.18212: variable 'ansible_module_compression' from source: unknown 30529 1726882638.18214: variable 'ansible_shell_type' from source: unknown 30529 1726882638.18216: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.18218: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.18220: variable 'ansible_pipelining' from source: unknown 30529 1726882638.18222: variable 'ansible_timeout' from source: unknown 30529 1726882638.18223: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882638.18499: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882638.18508: variable 'omit' from source: magic vars 30529 1726882638.18527: starting attempt loop 30529 1726882638.18532: running the handler 30529 1726882638.18534: variable '__network_connections_result' from source: set_fact 30529 1726882638.18578: variable '__network_connections_result' from source: set_fact 30529 1726882638.18668: handler run complete 30529 1726882638.18684: attempt loop complete, returning result 30529 1726882638.18687: _execute() done 30529 1726882638.18695: dumping result to json 30529 1726882638.18698: done dumping result, returning 30529 1726882638.18703: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001108] 30529 1726882638.18707: sending task result for task 12673a56-9f93-b0f1-edc0-000000001108 30529 1726882638.18813: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001108 30529 1726882638.18817: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection 
statebr, 925d78f3-a59a-474c-aff9-927d62a7a239" ] } } 30529 1726882638.18944: no more pending results, returning what we have 30529 1726882638.18949: results queue empty 30529 1726882638.18949: checking for any_errors_fatal 30529 1726882638.18957: done checking for any_errors_fatal 30529 1726882638.18958: checking for max_fail_percentage 30529 1726882638.18960: done checking for max_fail_percentage 30529 1726882638.18960: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.18961: done checking to see if all hosts have failed 30529 1726882638.18962: getting the remaining hosts for this loop 30529 1726882638.18963: done getting the remaining hosts for this loop 30529 1726882638.18967: getting the next task for host managed_node1 30529 1726882638.18975: done getting next task for host managed_node1 30529 1726882638.18979: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882638.18982: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.19000: getting variables 30529 1726882638.19002: in VariableManager get_vars() 30529 1726882638.19034: Calling all_inventory to load vars for managed_node1 30529 1726882638.19037: Calling groups_inventory to load vars for managed_node1 30529 1726882638.19039: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.19063: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.19071: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.19075: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.20122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.21243: done with get_vars() 30529 1726882638.21257: done getting variables 30529 1726882638.21300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:18 -0400 (0:00:00.051) 0:00:52.239 ****** 30529 1726882638.21322: entering _queue_task() for managed_node1/debug 30529 1726882638.21539: worker is 1 (out of 1 available) 30529 1726882638.21552: exiting _queue_task() for managed_node1/debug 30529 1726882638.21564: done queuing things up, now waiting for results queue to drain 30529 1726882638.21566: waiting for pending results... 
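The two debug tasks around this point (main.yml:181 and :186) simply print the registered `__network_connections_result` fact and, when `network_state` is non-empty, the state variable. A hedged sketch of what such tasks typically look like (the variable printed by the second task is an assumption; the first matches the `var` output visible in the log):

```yaml
# Sketch of the two debug tasks seen in this section of the log; actual
# definitions are at roles/network/tasks/main.yml:181 and :186.
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result

- name: Show debug messages for the network_state
  debug:
    var: network_state        # assumed; skipped below because network_state == {}
  when: network_state != {}
```

Because `debug` is a pure action plugin, the handler completes locally with no module upload, which is why no `_low_level_execute_command()` calls appear for these tasks.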
30529 1726882638.21743: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882638.21818: in run() - task 12673a56-9f93-b0f1-edc0-000000001109 30529 1726882638.21830: variable 'ansible_search_path' from source: unknown 30529 1726882638.21833: variable 'ansible_search_path' from source: unknown 30529 1726882638.21859: calling self._execute() 30529 1726882638.21933: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.21937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.21945: variable 'omit' from source: magic vars 30529 1726882638.22213: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.22228: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.22307: variable 'network_state' from source: role '' defaults 30529 1726882638.22316: Evaluated conditional (network_state != {}): False 30529 1726882638.22319: when evaluation is False, skipping this task 30529 1726882638.22321: _execute() done 30529 1726882638.22324: dumping result to json 30529 1726882638.22327: done dumping result, returning 30529 1726882638.22335: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000001109] 30529 1726882638.22339: sending task result for task 12673a56-9f93-b0f1-edc0-000000001109 30529 1726882638.22429: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001109 30529 1726882638.22432: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882638.22485: no more pending results, returning what we have 30529 1726882638.22489: results queue empty 30529 1726882638.22492: checking for any_errors_fatal 30529 1726882638.22501: done checking for any_errors_fatal 30529 1726882638.22502: checking for 
max_fail_percentage 30529 1726882638.22504: done checking for max_fail_percentage 30529 1726882638.22505: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.22506: done checking to see if all hosts have failed 30529 1726882638.22507: getting the remaining hosts for this loop 30529 1726882638.22508: done getting the remaining hosts for this loop 30529 1726882638.22512: getting the next task for host managed_node1 30529 1726882638.22518: done getting next task for host managed_node1 30529 1726882638.22522: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882638.22526: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.22542: getting variables 30529 1726882638.22544: in VariableManager get_vars() 30529 1726882638.22575: Calling all_inventory to load vars for managed_node1 30529 1726882638.22578: Calling groups_inventory to load vars for managed_node1 30529 1726882638.22580: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.22587: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.22592: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.22602: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.23589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.26265: done with get_vars() 30529 1726882638.26297: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:18 -0400 (0:00:00.050) 0:00:52.290 ****** 30529 1726882638.26423: entering _queue_task() for managed_node1/ping 30529 1726882638.26830: worker is 1 (out of 1 available) 30529 1726882638.26845: exiting _queue_task() for managed_node1/ping 30529 1726882638.26863: done queuing things up, now waiting for results queue to drain 30529 1726882638.26865: waiting for pending results... 
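The queued `ping` task round-trips a trivial module to the managed host to confirm it is still reachable after the network changes. Roughly (a sketch under the same distribution guard evaluated in the log, not the collection's actual source at main.yml:192):

```yaml
# Hypothetical sketch of the connectivity re-test queued above.
- name: Re-test connectivity
  ping:
  when: ansible_distribution_major_version != '6'
```

Unlike `debug`, `ping` is a real module, so the executor falls through to the `normal` action plugin and the ssh `_low_level_execute_command()` sequence that follows.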
30529 1726882638.27111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882638.27260: in run() - task 12673a56-9f93-b0f1-edc0-00000000110a 30529 1726882638.27437: variable 'ansible_search_path' from source: unknown 30529 1726882638.27440: variable 'ansible_search_path' from source: unknown 30529 1726882638.27443: calling self._execute() 30529 1726882638.27497: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.27510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.27526: variable 'omit' from source: magic vars 30529 1726882638.28047: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.28187: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.28196: variable 'omit' from source: magic vars 30529 1726882638.28200: variable 'omit' from source: magic vars 30529 1726882638.28243: variable 'omit' from source: magic vars 30529 1726882638.28288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882638.28399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882638.28402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882638.28406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.28413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.28460: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882638.28471: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.28510: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882638.28760: Set connection var ansible_shell_executable to /bin/sh 30529 1726882638.28764: Set connection var ansible_pipelining to False 30529 1726882638.28766: Set connection var ansible_shell_type to sh 30529 1726882638.28781: Set connection var ansible_timeout to 10 30529 1726882638.28784: Set connection var ansible_connection to ssh 30529 1726882638.28802: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882638.28818: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.28821: variable 'ansible_connection' from source: unknown 30529 1726882638.28823: variable 'ansible_module_compression' from source: unknown 30529 1726882638.28826: variable 'ansible_shell_type' from source: unknown 30529 1726882638.28828: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.28830: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.28832: variable 'ansible_pipelining' from source: unknown 30529 1726882638.28836: variable 'ansible_timeout' from source: unknown 30529 1726882638.28839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.29058: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882638.29065: variable 'omit' from source: magic vars 30529 1726882638.29070: starting attempt loop 30529 1726882638.29073: running the handler 30529 1726882638.29086: _low_level_execute_command(): starting 30529 1726882638.29095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882638.29651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.29656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882638.29718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.29721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882638.29723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882638.29736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.29804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.31439: stdout chunk (state=3): >>>/root <<< 30529 1726882638.31543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.31566: stderr chunk (state=3): >>><<< 30529 1726882638.31569: stdout chunk (state=3): >>><<< 30529 1726882638.31588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882638.31602: _low_level_execute_command(): starting 30529 1726882638.31608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861 `" && echo ansible-tmp-1726882638.315867-33107-30669750464861="` echo /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861 `" ) && sleep 0' 30529 1726882638.32047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882638.32050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.32053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882638.32064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.32081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882638.32098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.32149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.34024: stdout chunk (state=3): >>>ansible-tmp-1726882638.315867-33107-30669750464861=/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861 <<< 30529 1726882638.34136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.34154: stderr chunk (state=3): >>><<< 30529 1726882638.34157: stdout chunk (state=3): >>><<< 30529 1726882638.34168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882638.315867-33107-30669750464861=/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882638.34207: variable 'ansible_module_compression' from source: unknown 30529 1726882638.34237: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882638.34263: variable 'ansible_facts' from source: unknown 30529 1726882638.34318: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py 30529 1726882638.34479: Sending initial data 30529 1726882638.34483: Sent initial data (151 bytes) 30529 1726882638.35047: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882638.35050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882638.35113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.35157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882638.35172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882638.35263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.35317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.36820: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882638.36832: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882638.36862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882638.36904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcsqmiqmk /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py <<< 30529 1726882638.36909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py" <<< 30529 1726882638.36943: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcsqmiqmk" to remote "/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py" <<< 30529 1726882638.36947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py" <<< 30529 1726882638.37458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.37501: stderr chunk (state=3): >>><<< 30529 1726882638.37504: stdout chunk (state=3): >>><<< 30529 1726882638.37542: done transferring module to remote 30529 1726882638.37551: _low_level_execute_command(): starting 30529 1726882638.37555: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/ /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py && sleep 0' 30529 1726882638.38154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882638.38157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882638.38160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882638.38162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882638.38165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 <<< 30529 1726882638.38167: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882638.38169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.38171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882638.38173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882638.38175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882638.38210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.38254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882638.38298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882638.38428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.38564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.40499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.40502: stdout chunk (state=3): >>><<< 30529 1726882638.40505: stderr chunk (state=3): >>><<< 30529 1726882638.40507: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882638.40510: _low_level_execute_command(): starting 30529 1726882638.40512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/AnsiballZ_ping.py && sleep 0' 30529 1726882638.40883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882638.40896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882638.40904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882638.40919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882638.40931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882638.40938: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882638.40952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.40962: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 30529 1726882638.40969: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882638.40976: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882638.40984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882638.40995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882638.41008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882638.41016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882638.41065: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882638.41069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.41096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882638.41113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882638.41134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.41203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.56099: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882638.57316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882638.57348: stderr chunk (state=3): >>><<< 30529 1726882638.57352: stdout chunk (state=3): >>><<< 30529 1726882638.57365: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882638.57387: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882638.57504: _low_level_execute_command(): starting 30529 1726882638.57510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882638.315867-33107-30669750464861/ > /dev/null 2>&1 && sleep 0' 30529 1726882638.58252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882638.58258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882638.58338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882638.58343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882638.58380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882638.60144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882638.60172: stderr chunk (state=3): >>><<< 30529 1726882638.60178: stdout chunk (state=3): >>><<< 30529 1726882638.60213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882638.60216: handler run complete 30529 1726882638.60222: attempt loop complete, returning result 30529 1726882638.60224: _execute() done 30529 
1726882638.60227: dumping result to json 30529 1726882638.60231: done dumping result, returning 30529 1726882638.60242: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-00000000110a] 30529 1726882638.60245: sending task result for task 12673a56-9f93-b0f1-edc0-00000000110a 30529 1726882638.60346: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000110a 30529 1726882638.60349: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882638.60417: no more pending results, returning what we have 30529 1726882638.60422: results queue empty 30529 1726882638.60423: checking for any_errors_fatal 30529 1726882638.60430: done checking for any_errors_fatal 30529 1726882638.60431: checking for max_fail_percentage 30529 1726882638.60433: done checking for max_fail_percentage 30529 1726882638.60434: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.60435: done checking to see if all hosts have failed 30529 1726882638.60435: getting the remaining hosts for this loop 30529 1726882638.60437: done getting the remaining hosts for this loop 30529 1726882638.60441: getting the next task for host managed_node1 30529 1726882638.60455: done getting next task for host managed_node1 30529 1726882638.60457: ^ task is: TASK: meta (role_complete) 30529 1726882638.60463: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.60483: getting variables 30529 1726882638.60486: in VariableManager get_vars() 30529 1726882638.60537: Calling all_inventory to load vars for managed_node1 30529 1726882638.60540: Calling groups_inventory to load vars for managed_node1 30529 1726882638.60542: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.60552: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.60556: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.60558: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.61594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.62452: done with get_vars() 30529 1726882638.62467: done getting variables 30529 1726882638.62528: done queuing things up, now waiting for results queue to drain 30529 1726882638.62529: results queue empty 30529 1726882638.62530: checking for any_errors_fatal 30529 1726882638.62532: done checking for any_errors_fatal 30529 1726882638.62532: checking for max_fail_percentage 30529 1726882638.62533: done checking for max_fail_percentage 30529 1726882638.62533: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.62534: done checking to see if all hosts have failed 30529 1726882638.62534: getting the remaining hosts for this 
loop 30529 1726882638.62535: done getting the remaining hosts for this loop 30529 1726882638.62536: getting the next task for host managed_node1 30529 1726882638.62539: done getting next task for host managed_node1 30529 1726882638.62540: ^ task is: TASK: Show result 30529 1726882638.62542: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.62544: getting variables 30529 1726882638.62544: in VariableManager get_vars() 30529 1726882638.62551: Calling all_inventory to load vars for managed_node1 30529 1726882638.62553: Calling groups_inventory to load vars for managed_node1 30529 1726882638.62555: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.62559: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.62560: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.62562: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.63170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.64631: done with get_vars() 30529 1726882638.64653: done getting variables 30529 1726882638.64690: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:37:18 -0400 (0:00:00.382) 0:00:52.673 ****** 30529 1726882638.64717: entering _queue_task() for managed_node1/debug 30529 1726882638.65046: worker is 1 (out of 1 available) 30529 1726882638.65059: exiting _queue_task() for managed_node1/debug 30529 1726882638.65073: done queuing things up, now waiting for results queue to drain 30529 1726882638.65075: waiting for pending results... 
30529 1726882638.65361: running TaskExecutor() for managed_node1/TASK: Show result 30529 1726882638.65437: in run() - task 12673a56-9f93-b0f1-edc0-000000001090 30529 1726882638.65474: variable 'ansible_search_path' from source: unknown 30529 1726882638.65479: variable 'ansible_search_path' from source: unknown 30529 1726882638.65532: calling self._execute() 30529 1726882638.65600: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.65603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.65612: variable 'omit' from source: magic vars 30529 1726882638.65946: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.65955: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.65962: variable 'omit' from source: magic vars 30529 1726882638.65997: variable 'omit' from source: magic vars 30529 1726882638.66036: variable 'omit' from source: magic vars 30529 1726882638.66064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882638.66094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882638.66108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882638.66125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.66136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882638.66158: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882638.66162: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.66164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.66240: Set 
connection var ansible_shell_executable to /bin/sh 30529 1726882638.66244: Set connection var ansible_pipelining to False 30529 1726882638.66247: Set connection var ansible_shell_type to sh 30529 1726882638.66255: Set connection var ansible_timeout to 10 30529 1726882638.66257: Set connection var ansible_connection to ssh 30529 1726882638.66263: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882638.66280: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.66284: variable 'ansible_connection' from source: unknown 30529 1726882638.66287: variable 'ansible_module_compression' from source: unknown 30529 1726882638.66291: variable 'ansible_shell_type' from source: unknown 30529 1726882638.66295: variable 'ansible_shell_executable' from source: unknown 30529 1726882638.66298: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.66300: variable 'ansible_pipelining' from source: unknown 30529 1726882638.66302: variable 'ansible_timeout' from source: unknown 30529 1726882638.66305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.66404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882638.66412: variable 'omit' from source: magic vars 30529 1726882638.66418: starting attempt loop 30529 1726882638.66421: running the handler 30529 1726882638.66460: variable '__network_connections_result' from source: set_fact 30529 1726882638.66518: variable '__network_connections_result' from source: set_fact 30529 1726882638.66614: handler run complete 30529 1726882638.66631: attempt loop complete, returning result 30529 1726882638.66633: _execute() done 30529 1726882638.66636: dumping result to json 30529 
1726882638.66638: done dumping result, returning 30529 1726882638.66646: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-000000001090] 30529 1726882638.66650: sending task result for task 12673a56-9f93-b0f1-edc0-000000001090 30529 1726882638.66741: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001090 30529 1726882638.66744: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239" ] } } 30529 1726882638.66843: no more pending results, returning what we have 30529 1726882638.66846: results queue empty 30529 1726882638.66847: checking for any_errors_fatal 30529 1726882638.66849: done checking for any_errors_fatal 30529 1726882638.66850: checking for max_fail_percentage 30529 1726882638.66851: done checking for max_fail_percentage 30529 1726882638.66852: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.66853: done checking to see if all hosts have failed 30529 1726882638.66854: getting the remaining hosts for this loop 30529 1726882638.66856: done getting the remaining hosts for this loop 30529 1726882638.66859: getting the next task for host managed_node1 30529 1726882638.66867: done getting next task for host managed_node1 30529 1726882638.66870: ^ task is: TASK: Include network role 30529 1726882638.66873: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.66876: getting variables 30529 1726882638.66877: in VariableManager get_vars() 30529 1726882638.66908: Calling all_inventory to load vars for managed_node1 30529 1726882638.66910: Calling groups_inventory to load vars for managed_node1 30529 1726882638.66913: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.66921: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.66924: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.66927: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.72020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.73015: done with get_vars() 30529 1726882638.73031: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:37:18 -0400 (0:00:00.083) 0:00:52.756 ****** 30529 1726882638.73084: entering _queue_task() for 
managed_node1/include_role 30529 1726882638.73476: worker is 1 (out of 1 available) 30529 1726882638.73491: exiting _queue_task() for managed_node1/include_role 30529 1726882638.73696: done queuing things up, now waiting for results queue to drain 30529 1726882638.73698: waiting for pending results... 30529 1726882638.73841: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882638.73962: in run() - task 12673a56-9f93-b0f1-edc0-000000001094 30529 1726882638.73976: variable 'ansible_search_path' from source: unknown 30529 1726882638.73981: variable 'ansible_search_path' from source: unknown 30529 1726882638.74044: calling self._execute() 30529 1726882638.74141: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.74146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.74155: variable 'omit' from source: magic vars 30529 1726882638.74598: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.74602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.74605: _execute() done 30529 1726882638.74607: dumping result to json 30529 1726882638.74611: done dumping result, returning 30529 1726882638.74614: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000001094] 30529 1726882638.74617: sending task result for task 12673a56-9f93-b0f1-edc0-000000001094 30529 1726882638.74732: no more pending results, returning what we have 30529 1726882638.74737: in VariableManager get_vars() 30529 1726882638.74775: Calling all_inventory to load vars for managed_node1 30529 1726882638.74778: Calling groups_inventory to load vars for managed_node1 30529 1726882638.74781: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.75100: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.75105: Calling groups_plugins_inventory to load vars 
for managed_node1 30529 1726882638.75110: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.75705: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001094 30529 1726882638.75709: WORKER PROCESS EXITING 30529 1726882638.76281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.77221: done with get_vars() 30529 1726882638.77234: variable 'ansible_search_path' from source: unknown 30529 1726882638.77235: variable 'ansible_search_path' from source: unknown 30529 1726882638.77327: variable 'omit' from source: magic vars 30529 1726882638.77356: variable 'omit' from source: magic vars 30529 1726882638.77367: variable 'omit' from source: magic vars 30529 1726882638.77369: we have included files to process 30529 1726882638.77370: generating all_blocks data 30529 1726882638.77372: done generating all_blocks data 30529 1726882638.77374: processing included file: fedora.linux_system_roles.network 30529 1726882638.77387: in VariableManager get_vars() 30529 1726882638.77399: done with get_vars() 30529 1726882638.77418: in VariableManager get_vars() 30529 1726882638.77429: done with get_vars() 30529 1726882638.77455: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882638.77560: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882638.77643: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882638.78113: in VariableManager get_vars() 30529 1726882638.78131: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882638.80060: iterating over new_blocks loaded from include file 30529 1726882638.80062: in VariableManager get_vars() 30529 1726882638.80078: done with get_vars() 30529 1726882638.80080: 
filtering new block on tags 30529 1726882638.80373: done filtering new block on tags 30529 1726882638.80377: in VariableManager get_vars() 30529 1726882638.80396: done with get_vars() 30529 1726882638.80398: filtering new block on tags 30529 1726882638.80415: done filtering new block on tags 30529 1726882638.80418: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882638.80424: extending task lists for all hosts with included blocks 30529 1726882638.80535: done extending task lists 30529 1726882638.80537: done processing included files 30529 1726882638.80538: results queue empty 30529 1726882638.80538: checking for any_errors_fatal 30529 1726882638.80543: done checking for any_errors_fatal 30529 1726882638.80543: checking for max_fail_percentage 30529 1726882638.80544: done checking for max_fail_percentage 30529 1726882638.80545: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.80546: done checking to see if all hosts have failed 30529 1726882638.80547: getting the remaining hosts for this loop 30529 1726882638.80548: done getting the remaining hosts for this loop 30529 1726882638.80551: getting the next task for host managed_node1 30529 1726882638.80555: done getting next task for host managed_node1 30529 1726882638.80558: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882638.80561: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.80572: getting variables 30529 1726882638.80573: in VariableManager get_vars() 30529 1726882638.80585: Calling all_inventory to load vars for managed_node1 30529 1726882638.80587: Calling groups_inventory to load vars for managed_node1 30529 1726882638.80592: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.80599: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.80602: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.80605: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.81774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.82680: done with get_vars() 30529 1726882638.82699: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:18 -0400 (0:00:00.096) 0:00:52.853 ****** 30529 1726882638.82751: entering _queue_task() for managed_node1/include_tasks 30529 1726882638.83021: worker is 1 (out of 1 available) 30529 
1726882638.83034: exiting _queue_task() for managed_node1/include_tasks 30529 1726882638.83048: done queuing things up, now waiting for results queue to drain 30529 1726882638.83049: waiting for pending results... 30529 1726882638.83231: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882638.83313: in run() - task 12673a56-9f93-b0f1-edc0-00000000127a 30529 1726882638.83326: variable 'ansible_search_path' from source: unknown 30529 1726882638.83329: variable 'ansible_search_path' from source: unknown 30529 1726882638.83356: calling self._execute() 30529 1726882638.83432: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.83436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.83445: variable 'omit' from source: magic vars 30529 1726882638.83717: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.83734: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.83740: _execute() done 30529 1726882638.83744: dumping result to json 30529 1726882638.83746: done dumping result, returning 30529 1726882638.83754: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-00000000127a] 30529 1726882638.83758: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127a 30529 1726882638.83851: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127a 30529 1726882638.83854: WORKER PROCESS EXITING 30529 1726882638.83907: no more pending results, returning what we have 30529 1726882638.83913: in VariableManager get_vars() 30529 1726882638.83955: Calling all_inventory to load vars for managed_node1 30529 1726882638.83957: Calling groups_inventory to load vars for managed_node1 30529 1726882638.83959: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882638.83971: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.83974: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.83976: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.85336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.86209: done with get_vars() 30529 1726882638.86223: variable 'ansible_search_path' from source: unknown 30529 1726882638.86224: variable 'ansible_search_path' from source: unknown 30529 1726882638.86250: we have included files to process 30529 1726882638.86251: generating all_blocks data 30529 1726882638.86252: done generating all_blocks data 30529 1726882638.86254: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882638.86255: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882638.86256: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882638.86636: done processing included file 30529 1726882638.86637: iterating over new_blocks loaded from include file 30529 1726882638.86638: in VariableManager get_vars() 30529 1726882638.86654: done with get_vars() 30529 1726882638.86655: filtering new block on tags 30529 1726882638.86674: done filtering new block on tags 30529 1726882638.86676: in VariableManager get_vars() 30529 1726882638.86688: done with get_vars() 30529 1726882638.86689: filtering new block on tags 30529 1726882638.86721: done filtering new block on tags 30529 1726882638.86723: in VariableManager get_vars() 30529 1726882638.86736: done with get_vars() 30529 1726882638.86737: filtering new block on tags 30529 1726882638.86760: done filtering new block on tags 30529 1726882638.86762: done iterating over new_blocks 
loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882638.86765: extending task lists for all hosts with included blocks 30529 1726882638.88257: done extending task lists 30529 1726882638.88259: done processing included files 30529 1726882638.88260: results queue empty 30529 1726882638.88260: checking for any_errors_fatal 30529 1726882638.88263: done checking for any_errors_fatal 30529 1726882638.88264: checking for max_fail_percentage 30529 1726882638.88265: done checking for max_fail_percentage 30529 1726882638.88266: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.88266: done checking to see if all hosts have failed 30529 1726882638.88267: getting the remaining hosts for this loop 30529 1726882638.88268: done getting the remaining hosts for this loop 30529 1726882638.88271: getting the next task for host managed_node1 30529 1726882638.88276: done getting next task for host managed_node1 30529 1726882638.88279: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882638.88283: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882638.88301: getting variables 30529 1726882638.88303: in VariableManager get_vars() 30529 1726882638.88317: Calling all_inventory to load vars for managed_node1 30529 1726882638.88319: Calling groups_inventory to load vars for managed_node1 30529 1726882638.88321: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.88327: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.88330: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.88333: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.89062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.89973: done with get_vars() 30529 1726882638.89988: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:18 -0400 (0:00:00.073) 0:00:52.926 ****** 30529 1726882638.90058: entering _queue_task() for managed_node1/setup 30529 1726882638.90407: worker is 1 (out of 1 available) 30529 1726882638.90420: exiting _queue_task() for managed_node1/setup 30529 
1726882638.90433: done queuing things up, now waiting for results queue to drain 30529 1726882638.90435: waiting for pending results... 30529 1726882638.90723: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882638.90820: in run() - task 12673a56-9f93-b0f1-edc0-0000000012d1 30529 1726882638.90824: variable 'ansible_search_path' from source: unknown 30529 1726882638.90827: variable 'ansible_search_path' from source: unknown 30529 1726882638.90967: calling self._execute() 30529 1726882638.90972: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.90975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.90978: variable 'omit' from source: magic vars 30529 1726882638.91498: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.91502: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.91527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882638.93330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882638.93381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882638.93412: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882638.93439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882638.93459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882638.93520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30529 1726882638.93545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882638.93561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882638.93587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882638.93603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882638.93640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882638.93661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882638.93677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882638.93706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882638.93717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882638.93825: variable '__network_required_facts' from source: role '' defaults 30529 1726882638.93832: variable 'ansible_facts' from source: unknown 30529 1726882638.94498: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882638.94502: when evaluation is False, skipping this task 30529 1726882638.94505: _execute() done 30529 1726882638.94507: dumping result to json 30529 1726882638.94510: done dumping result, returning 30529 1726882638.94514: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-0000000012d1] 30529 1726882638.94516: sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d1 30529 1726882638.94584: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d1 30529 1726882638.94587: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882638.94635: no more pending results, returning what we have 30529 1726882638.94639: results queue empty 30529 1726882638.94640: checking for any_errors_fatal 30529 1726882638.94642: done checking for any_errors_fatal 30529 1726882638.94643: checking for max_fail_percentage 30529 1726882638.94644: done checking for max_fail_percentage 30529 1726882638.94645: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.94646: done checking to see if all hosts have failed 30529 1726882638.94647: getting the remaining hosts for this loop 30529 1726882638.94649: done getting the remaining hosts for this loop 30529 1726882638.94652: getting the next task for host managed_node1 30529 1726882638.94664: done getting next task for host managed_node1 
30529 1726882638.94668: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882638.94674: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.94701: getting variables 30529 1726882638.94703: in VariableManager get_vars() 30529 1726882638.94742: Calling all_inventory to load vars for managed_node1 30529 1726882638.94745: Calling groups_inventory to load vars for managed_node1 30529 1726882638.94747: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.94758: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.94761: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.94770: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.95842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882638.96716: done with get_vars() 30529 1726882638.96731: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:18 -0400 (0:00:00.067) 0:00:52.994 ****** 30529 1726882638.96804: entering _queue_task() for managed_node1/stat 30529 1726882638.97032: worker is 1 (out of 1 available) 30529 1726882638.97047: exiting _queue_task() for managed_node1/stat 30529 1726882638.97060: done queuing things up, now waiting for results queue to drain 30529 1726882638.97061: waiting for pending results... 
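
The "Ensure ansible_facts used by role are present" task above was skipped because its gating conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False: every fact the role needs was already cached. The following is a minimal sketch of that set-difference check in plain Python (the fact names in the usage are illustrative, not taken from the role):

```python
# Sketch of the role's gating logic: run fact gathering only when at
# least one required fact is missing from the already-collected
# ansible_facts. Mirrors the Jinja expression
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

def needs_fact_gathering(required_facts, ansible_facts):
    """Return True when any required fact name is absent from ansible_facts."""
    missing = [name for name in required_facts if name not in ansible_facts]
    return len(missing) > 0

# In the log above the difference was empty, so the conditional was
# False and the setup task was skipped.
```

When the condition is False, Ansible short-circuits in `_execute()` and reports the task as skipped without ever contacting the host, which is why the log shows "when evaluation is False, skipping this task" immediately after the conditional.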
30529 1726882638.97240: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882638.97352: in run() - task 12673a56-9f93-b0f1-edc0-0000000012d3 30529 1726882638.97374: variable 'ansible_search_path' from source: unknown 30529 1726882638.97381: variable 'ansible_search_path' from source: unknown 30529 1726882638.97424: calling self._execute() 30529 1726882638.97521: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882638.97526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882638.97537: variable 'omit' from source: magic vars 30529 1726882638.97974: variable 'ansible_distribution_major_version' from source: facts 30529 1726882638.98005: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882638.98217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882638.98433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882638.98469: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882638.98532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882638.98554: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882638.98664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882638.98683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882638.98718: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882638.98747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882638.98873: variable '__network_is_ostree' from source: set_fact 30529 1726882638.98880: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882638.98883: when evaluation is False, skipping this task 30529 1726882638.98886: _execute() done 30529 1726882638.98888: dumping result to json 30529 1726882638.98892: done dumping result, returning 30529 1726882638.98898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-0000000012d3] 30529 1726882638.98900: sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d3 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882638.99105: no more pending results, returning what we have 30529 1726882638.99109: results queue empty 30529 1726882638.99110: checking for any_errors_fatal 30529 1726882638.99115: done checking for any_errors_fatal 30529 1726882638.99116: checking for max_fail_percentage 30529 1726882638.99119: done checking for max_fail_percentage 30529 1726882638.99120: checking to see if all hosts have failed and the running result is not ok 30529 1726882638.99121: done checking to see if all hosts have failed 30529 1726882638.99121: getting the remaining hosts for this loop 30529 1726882638.99123: done getting the remaining hosts for this loop 30529 1726882638.99128: getting the next task for host managed_node1 30529 1726882638.99135: done getting next task for host managed_node1 30529 
1726882638.99140: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882638.99146: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882638.99166: getting variables 30529 1726882638.99168: in VariableManager get_vars() 30529 1726882638.99202: Calling all_inventory to load vars for managed_node1 30529 1726882638.99205: Calling groups_inventory to load vars for managed_node1 30529 1726882638.99207: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882638.99219: Calling all_plugins_play to load vars for managed_node1 30529 1726882638.99223: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882638.99229: Calling groups_plugins_play to load vars for managed_node1 30529 1726882638.99750: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d3 30529 1726882638.99756: WORKER PROCESS EXITING 30529 1726882639.00762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882639.01942: done with get_vars() 30529 1726882639.01964: done getting variables 30529 1726882639.02027: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:19 -0400 (0:00:00.052) 0:00:53.046 ****** 30529 1726882639.02058: entering _queue_task() for managed_node1/set_fact 30529 1726882639.02288: worker is 1 (out of 1 available) 30529 1726882639.02306: exiting _queue_task() for managed_node1/set_fact 30529 1726882639.02324: done queuing things up, now waiting for results queue to drain 30529 1726882639.02327: waiting for pending results... 
30529 1726882639.02548: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882639.02648: in run() - task 12673a56-9f93-b0f1-edc0-0000000012d4 30529 1726882639.02660: variable 'ansible_search_path' from source: unknown 30529 1726882639.02663: variable 'ansible_search_path' from source: unknown 30529 1726882639.02689: calling self._execute() 30529 1726882639.02757: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882639.02761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882639.02769: variable 'omit' from source: magic vars 30529 1726882639.03028: variable 'ansible_distribution_major_version' from source: facts 30529 1726882639.03037: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882639.03151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882639.03340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882639.03375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882639.03403: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882639.03429: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882639.03491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882639.03512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882639.03529: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882639.03546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882639.03617: variable '__network_is_ostree' from source: set_fact 30529 1726882639.03622: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882639.03625: when evaluation is False, skipping this task 30529 1726882639.03627: _execute() done 30529 1726882639.03630: dumping result to json 30529 1726882639.03632: done dumping result, returning 30529 1726882639.03641: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-0000000012d4] 30529 1726882639.03643: sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d4 30529 1726882639.03726: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d4 30529 1726882639.03728: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882639.03774: no more pending results, returning what we have 30529 1726882639.03778: results queue empty 30529 1726882639.03779: checking for any_errors_fatal 30529 1726882639.03786: done checking for any_errors_fatal 30529 1726882639.03786: checking for max_fail_percentage 30529 1726882639.03788: done checking for max_fail_percentage 30529 1726882639.03789: checking to see if all hosts have failed and the running result is not ok 30529 1726882639.03789: done checking to see if all hosts have failed 30529 1726882639.03790: getting the remaining hosts for this loop 30529 1726882639.03792: done getting the remaining hosts for this loop 
30529 1726882639.03797: getting the next task for host managed_node1 30529 1726882639.03807: done getting next task for host managed_node1 30529 1726882639.03811: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882639.03843: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882639.03863: getting variables 30529 1726882639.03865: in VariableManager get_vars() 30529 1726882639.03897: Calling all_inventory to load vars for managed_node1 30529 1726882639.03899: Calling groups_inventory to load vars for managed_node1 30529 1726882639.03901: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882639.03909: Calling all_plugins_play to load vars for managed_node1 30529 1726882639.03911: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882639.03914: Calling groups_plugins_play to load vars for managed_node1 30529 1726882639.04643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882639.06061: done with get_vars() 30529 1726882639.06087: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:19 -0400 (0:00:00.041) 0:00:53.087 ****** 30529 1726882639.06172: entering _queue_task() for managed_node1/service_facts 30529 1726882639.06477: worker is 1 (out of 1 available) 30529 1726882639.06492: exiting _queue_task() for managed_node1/service_facts 30529 1726882639.06511: done queuing things up, now waiting for results queue to drain 30529 1726882639.06513: waiting for pending results... 
30529 1726882639.06767: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882639.07079: in run() - task 12673a56-9f93-b0f1-edc0-0000000012d6 30529 1726882639.07083: variable 'ansible_search_path' from source: unknown 30529 1726882639.07085: variable 'ansible_search_path' from source: unknown 30529 1726882639.07088: calling self._execute() 30529 1726882639.07099: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882639.07105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882639.07114: variable 'omit' from source: magic vars 30529 1726882639.07460: variable 'ansible_distribution_major_version' from source: facts 30529 1726882639.07494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882639.07499: variable 'omit' from source: magic vars 30529 1726882639.07572: variable 'omit' from source: magic vars 30529 1726882639.07600: variable 'omit' from source: magic vars 30529 1726882639.07628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882639.07670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882639.07681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882639.07697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882639.07710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882639.07732: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882639.07735: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882639.07738: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882639.07830: Set connection var ansible_shell_executable to /bin/sh 30529 1726882639.07833: Set connection var ansible_pipelining to False 30529 1726882639.07836: Set connection var ansible_shell_type to sh 30529 1726882639.07844: Set connection var ansible_timeout to 10 30529 1726882639.07846: Set connection var ansible_connection to ssh 30529 1726882639.07851: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882639.07872: variable 'ansible_shell_executable' from source: unknown 30529 1726882639.07875: variable 'ansible_connection' from source: unknown 30529 1726882639.07878: variable 'ansible_module_compression' from source: unknown 30529 1726882639.07880: variable 'ansible_shell_type' from source: unknown 30529 1726882639.07882: variable 'ansible_shell_executable' from source: unknown 30529 1726882639.07885: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882639.07887: variable 'ansible_pipelining' from source: unknown 30529 1726882639.07890: variable 'ansible_timeout' from source: unknown 30529 1726882639.07898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882639.08040: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882639.08048: variable 'omit' from source: magic vars 30529 1726882639.08053: starting attempt loop 30529 1726882639.08056: running the handler 30529 1726882639.08067: _low_level_execute_command(): starting 30529 1726882639.08075: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882639.08614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882639.08617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.08620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882639.08622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882639.08626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.08667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882639.08670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882639.08725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882639.10364: stdout chunk (state=3): >>>/root <<< 30529 1726882639.10459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882639.10510: stderr chunk (state=3): >>><<< 30529 1726882639.10513: stdout chunk (state=3): >>><<< 30529 1726882639.10544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882639.10547: _low_level_execute_command(): starting 30529 1726882639.10565: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883 `" && echo ansible-tmp-1726882639.1053212-33141-164704235320883="` echo /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883 `" ) && sleep 0' 30529 1726882639.11125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882639.11128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 
1726882639.11138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882639.11140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.11186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882639.11191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882639.11199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882639.11234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882639.13098: stdout chunk (state=3): >>>ansible-tmp-1726882639.1053212-33141-164704235320883=/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883 <<< 30529 1726882639.13212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882639.13238: stderr chunk (state=3): >>><<< 30529 1726882639.13242: stdout chunk (state=3): >>><<< 30529 1726882639.13255: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882639.1053212-33141-164704235320883=/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882639.13289: variable 'ansible_module_compression' from source: unknown 30529 1726882639.13326: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882639.13360: variable 'ansible_facts' from source: unknown 30529 1726882639.13414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py 30529 1726882639.13507: Sending initial data 30529 1726882639.13510: Sent initial data (162 bytes) 30529 1726882639.13937: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882639.13940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.13943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 
1726882639.13945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882639.13947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.14005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882639.14008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882639.14010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882639.14040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882639.15559: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882639.15562: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882639.15601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882639.15642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpadnlbg5l /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py <<< 30529 1726882639.15644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py" <<< 30529 1726882639.15679: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpadnlbg5l" to remote "/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py" <<< 30529 1726882639.15685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py" <<< 30529 1726882639.16211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882639.16242: stderr chunk (state=3): >>><<< 30529 1726882639.16246: stdout chunk (state=3): >>><<< 30529 1726882639.16267: done transferring module to remote 30529 1726882639.16276: _low_level_execute_command(): starting 30529 1726882639.16278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/ /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py && sleep 0' 30529 1726882639.16668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882639.16711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882639.16715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 
30529 1726882639.16717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.16719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882639.16725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882639.16727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.16766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882639.16769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882639.16773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882639.16823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882639.18534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882639.18552: stderr chunk (state=3): >>><<< 30529 1726882639.18555: stdout chunk (state=3): >>><<< 30529 1726882639.18567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882639.18570: _low_level_execute_command(): starting 30529 1726882639.18572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/AnsiballZ_service_facts.py && sleep 0' 30529 1726882639.18971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882639.18974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.18976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882639.18978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882639.19025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882639.19029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882639.19080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882640.69770: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30529 1726882640.69787: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": 
{"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882640.71278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882640.71295: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 30529 1726882640.71360: stderr chunk (state=3): >>><<< 30529 1726882640.71398: stdout chunk (state=3): >>><<< 30529 1726882640.71600: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": 
"fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": 
"plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": 
{"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": 
{"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882640.72569: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882640.72587: _low_level_execute_command(): starting 30529 1726882640.72608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882639.1053212-33141-164704235320883/ > /dev/null 2>&1 && sleep 0' 30529 1726882640.73265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882640.73281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882640.73306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882640.73380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882640.73431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882640.73456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882640.73496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882640.73600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882640.75577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882640.75580: stdout chunk (state=3): >>><<< 30529 1726882640.75583: stderr chunk (state=3): >>><<< 30529 1726882640.75877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882640.75881: handler run complete 30529 1726882640.76064: variable 'ansible_facts' from source: unknown 30529 1726882640.76451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882640.77504: variable 'ansible_facts' from source: unknown 30529 1726882640.77741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882640.78267: attempt loop complete, returning result 30529 1726882640.78278: _execute() done 30529 1726882640.78284: dumping result to json 30529 1726882640.78356: done dumping result, returning 30529 1726882640.78582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-0000000012d6] 30529 1726882640.78587: sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d6 30529 1726882640.80953: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d6 30529 1726882640.80957: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882640.81069: no more pending results, returning what we have 30529 1726882640.81071: results queue empty 30529 1726882640.81072: checking for any_errors_fatal 30529 1726882640.81075: done checking for any_errors_fatal 30529 1726882640.81075: checking for max_fail_percentage 30529 1726882640.81077: done checking for max_fail_percentage 30529 1726882640.81077: checking to see if all hosts have failed and the running result is not ok 30529 1726882640.81078: 
done checking to see if all hosts have failed 30529 1726882640.81079: getting the remaining hosts for this loop 30529 1726882640.81080: done getting the remaining hosts for this loop 30529 1726882640.81083: getting the next task for host managed_node1 30529 1726882640.81088: done getting next task for host managed_node1 30529 1726882640.81091: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882640.81099: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882640.81110: getting variables 30529 1726882640.81112: in VariableManager get_vars() 30529 1726882640.81139: Calling all_inventory to load vars for managed_node1 30529 1726882640.81142: Calling groups_inventory to load vars for managed_node1 30529 1726882640.81144: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882640.81153: Calling all_plugins_play to load vars for managed_node1 30529 1726882640.81156: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882640.81165: Calling groups_plugins_play to load vars for managed_node1 30529 1726882640.83458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882640.85196: done with get_vars() 30529 1726882640.85238: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:20 -0400 (0:00:01.791) 0:00:54.879 ****** 30529 1726882640.85349: entering _queue_task() for managed_node1/package_facts 30529 1726882640.86206: worker is 1 (out of 1 available) 30529 1726882640.86220: exiting _queue_task() for managed_node1/package_facts 30529 1726882640.86349: done queuing things up, now waiting for results queue to drain 30529 1726882640.86351: waiting for pending results... 
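The `service_facts` payload dumped earlier in this log is a mapping of unit name to a dict with `name`, `state`, `status`, and `source` keys. As an illustrative aside (not part of the log itself), a minimal sketch of how a consumer could filter that structure for running units — the sample entries are copied from the payload above; the helper name is hypothetical:

```python
# Sketch: filter an Ansible service_facts-style dict for running units.
# The entry shape ({"name", "state", "status", "source"}) matches the
# payload logged above; sample data is excerpted from it.

service_facts = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service",
                                   "state": "running", "status": "active",
                                   "source": "systemd"},
    "sssd.service": {"name": "sssd.service", "state": "stopped",
                     "status": "enabled", "source": "systemd"},
}

def running_services(services):
    """Return sorted unit names whose reported state is 'running'."""
    return sorted(name for name, svc in services.items()
                  if svc["state"] == "running")

print(running_services(service_facts))
# → ['serial-getty@ttyS0.service', 'sshd.service']
```

Note that because the task above ran with `no_log: true`, this dict is visible only in the controller's debug stream; the normal task result is censored, as the `ok: [managed_node1]` output shows.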
30529 1726882640.86794: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882640.87023: in run() - task 12673a56-9f93-b0f1-edc0-0000000012d7 30529 1726882640.87038: variable 'ansible_search_path' from source: unknown 30529 1726882640.87042: variable 'ansible_search_path' from source: unknown 30529 1726882640.87079: calling self._execute() 30529 1726882640.87176: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882640.87180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882640.87190: variable 'omit' from source: magic vars 30529 1726882640.87961: variable 'ansible_distribution_major_version' from source: facts 30529 1726882640.87999: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882640.88002: variable 'omit' from source: magic vars 30529 1726882640.88266: variable 'omit' from source: magic vars 30529 1726882640.88324: variable 'omit' from source: magic vars 30529 1726882640.88340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882640.88376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882640.88616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882640.88620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882640.88631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882640.88659: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882640.88662: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882640.88666: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882640.88834: Set connection var ansible_shell_executable to /bin/sh 30529 1726882640.88838: Set connection var ansible_pipelining to False 30529 1726882640.88840: Set connection var ansible_shell_type to sh 30529 1726882640.88842: Set connection var ansible_timeout to 10 30529 1726882640.88844: Set connection var ansible_connection to ssh 30529 1726882640.88846: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882640.89019: variable 'ansible_shell_executable' from source: unknown 30529 1726882640.89022: variable 'ansible_connection' from source: unknown 30529 1726882640.89025: variable 'ansible_module_compression' from source: unknown 30529 1726882640.89028: variable 'ansible_shell_type' from source: unknown 30529 1726882640.89030: variable 'ansible_shell_executable' from source: unknown 30529 1726882640.89032: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882640.89034: variable 'ansible_pipelining' from source: unknown 30529 1726882640.89037: variable 'ansible_timeout' from source: unknown 30529 1726882640.89041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882640.89433: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882640.89443: variable 'omit' from source: magic vars 30529 1726882640.89486: starting attempt loop 30529 1726882640.89489: running the handler 30529 1726882640.89491: _low_level_execute_command(): starting 30529 1726882640.89497: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882640.90813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882640.92324: stdout chunk (state=3): >>>/root <<< 30529 1726882640.92456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882640.92465: stderr chunk (state=3): >>><<< 30529 1726882640.92468: stdout chunk (state=3): >>><<< 30529 1726882640.92482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882640.92566: _low_level_execute_command(): starting 30529 1726882640.92577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761 `" && echo ansible-tmp-1726882640.9248188-33212-244735706222761="` echo /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761 `" ) && sleep 0' 30529 1726882640.93664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882640.93673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882640.93683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882640.93702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882640.93715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882640.93723: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882640.93733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882640.93747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882640.93904: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882640.93907: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 30529 1726882640.93910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882640.93912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882640.93914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882640.93916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882640.94121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882640.94173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882640.96011: stdout chunk (state=3): >>>ansible-tmp-1726882640.9248188-33212-244735706222761=/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761 <<< 30529 1726882640.96122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882640.96164: stderr chunk (state=3): >>><<< 30529 1726882640.96310: stdout chunk (state=3): >>><<< 30529 1726882640.96399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882640.9248188-33212-244735706222761=/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882640.96410: variable 'ansible_module_compression' from source: unknown 30529 1726882640.96422: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882640.96483: variable 'ansible_facts' from source: unknown 30529 1726882640.96881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py 30529 1726882640.97511: Sending initial data 30529 1726882640.97515: Sent initial data (162 bytes) 30529 1726882640.98445: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882640.98510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882640.98520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882640.98534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882640.98552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882640.98555: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882640.98562: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882640.98576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882640.98779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882640.98784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882640.98857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882641.00365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882641.00406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882641.00457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqvionx6y /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py <<< 30529 1726882641.00461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py" <<< 30529 1726882641.00708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqvionx6y" to remote "/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py" <<< 30529 1726882641.03477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882641.03999: stderr chunk (state=3): >>><<< 30529 1726882641.04004: stdout chunk (state=3): >>><<< 30529 1726882641.04006: done transferring module to remote 30529 1726882641.04007: _low_level_execute_command(): starting 30529 1726882641.04009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/ /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py && sleep 0' 30529 1726882641.04589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882641.04779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882641.04788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882641.04791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882641.04794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882641.06539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882641.06543: stdout chunk (state=3): >>><<< 30529 1726882641.06549: stderr chunk (state=3): >>><<< 30529 1726882641.06564: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882641.06567: _low_level_execute_command(): starting 30529 1726882641.06572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/AnsiballZ_package_facts.py && sleep 0' 30529 1726882641.07685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882641.07691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882641.07787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882641.07790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882641.07794: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882641.07797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882641.07800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882641.07940: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882641.07944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882641.07991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882641.51864: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": 
[{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": 
[{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30529 1726882641.51941: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882641.53727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882641.53731: stdout chunk (state=3): >>><<< 30529 1726882641.53733: stderr chunk (state=3): >>><<< 30529 1726882641.53899: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882641.56488: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882641.56716: _low_level_execute_command(): starting 30529 1726882641.56719: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882640.9248188-33212-244735706222761/ > /dev/null 2>&1 && sleep 0' 30529 1726882641.57469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882641.57479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882641.57487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882641.57507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882641.57518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882641.57525: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882641.57534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882641.57548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882641.57554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
<<< 30529 1726882641.57586: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882641.57589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882641.57592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882641.57596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882641.57598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882641.57600: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882641.57667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882641.57676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882641.57695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882641.57767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882641.59684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882641.59829: stdout chunk (state=3): >>><<< 30529 1726882641.59832: stderr chunk (state=3): >>><<< 30529 1726882641.59835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882641.59837: handler run complete 30529 1726882641.61449: variable 'ansible_facts' from source: unknown 30529 1726882641.62873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.64800: variable 'ansible_facts' from source: unknown 30529 1726882641.65417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.66818: attempt loop complete, returning result 30529 1726882641.66822: _execute() done 30529 1726882641.66824: dumping result to json 30529 1726882641.67229: done dumping result, returning 30529 1726882641.67264: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-0000000012d7] 30529 1726882641.67274: sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d7 30529 1726882641.71636: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000012d7 30529 1726882641.71640: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882641.71801: no more pending results, returning what we have 30529 1726882641.71804: results queue empty 30529 1726882641.71805: checking for 
any_errors_fatal 30529 1726882641.71809: done checking for any_errors_fatal 30529 1726882641.71810: checking for max_fail_percentage 30529 1726882641.71811: done checking for max_fail_percentage 30529 1726882641.71812: checking to see if all hosts have failed and the running result is not ok 30529 1726882641.71813: done checking to see if all hosts have failed 30529 1726882641.71814: getting the remaining hosts for this loop 30529 1726882641.71815: done getting the remaining hosts for this loop 30529 1726882641.71818: getting the next task for host managed_node1 30529 1726882641.71825: done getting next task for host managed_node1 30529 1726882641.71828: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882641.71833: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882641.71843: getting variables 30529 1726882641.71845: in VariableManager get_vars() 30529 1726882641.71873: Calling all_inventory to load vars for managed_node1 30529 1726882641.71875: Calling groups_inventory to load vars for managed_node1 30529 1726882641.71882: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882641.71890: Calling all_plugins_play to load vars for managed_node1 30529 1726882641.71898: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882641.71902: Calling groups_plugins_play to load vars for managed_node1 30529 1726882641.74051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.75975: done with get_vars() 30529 1726882641.76130: done getting variables 30529 1726882641.76189: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:21 -0400 (0:00:00.908) 0:00:55.788 ****** 30529 1726882641.76340: entering _queue_task() for managed_node1/debug 30529 1726882641.76961: worker is 1 (out of 1 available) 30529 1726882641.76973: exiting _queue_task() for managed_node1/debug 30529 1726882641.76985: done queuing things up, now waiting for results queue to drain 30529 1726882641.77098: waiting for pending results... 
30529 1726882641.77399: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882641.77599: in run() - task 12673a56-9f93-b0f1-edc0-00000000127b 30529 1726882641.77603: variable 'ansible_search_path' from source: unknown 30529 1726882641.77606: variable 'ansible_search_path' from source: unknown 30529 1726882641.77620: calling self._execute() 30529 1726882641.77719: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882641.77730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.77749: variable 'omit' from source: magic vars 30529 1726882641.78127: variable 'ansible_distribution_major_version' from source: facts 30529 1726882641.78188: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882641.78195: variable 'omit' from source: magic vars 30529 1726882641.78230: variable 'omit' from source: magic vars 30529 1726882641.78328: variable 'network_provider' from source: set_fact 30529 1726882641.78347: variable 'omit' from source: magic vars 30529 1726882641.78400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882641.78434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882641.78509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882641.78514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882641.78516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882641.78546: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882641.78556: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882641.78564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.78682: Set connection var ansible_shell_executable to /bin/sh 30529 1726882641.78695: Set connection var ansible_pipelining to False 30529 1726882641.78704: Set connection var ansible_shell_type to sh 30529 1726882641.78742: Set connection var ansible_timeout to 10 30529 1726882641.78745: Set connection var ansible_connection to ssh 30529 1726882641.78748: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882641.78771: variable 'ansible_shell_executable' from source: unknown 30529 1726882641.78835: variable 'ansible_connection' from source: unknown 30529 1726882641.78838: variable 'ansible_module_compression' from source: unknown 30529 1726882641.78841: variable 'ansible_shell_type' from source: unknown 30529 1726882641.78843: variable 'ansible_shell_executable' from source: unknown 30529 1726882641.78846: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882641.78848: variable 'ansible_pipelining' from source: unknown 30529 1726882641.78850: variable 'ansible_timeout' from source: unknown 30529 1726882641.78852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.78975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882641.78996: variable 'omit' from source: magic vars 30529 1726882641.79008: starting attempt loop 30529 1726882641.79052: running the handler 30529 1726882641.79074: handler run complete 30529 1726882641.79096: attempt loop complete, returning result 30529 1726882641.79160: _execute() done 30529 1726882641.79163: dumping result to json 30529 1726882641.79165: done dumping result, returning 
30529 1726882641.79168: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-00000000127b] 30529 1726882641.79170: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127b 30529 1726882641.79248: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127b 30529 1726882641.79252: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882641.79331: no more pending results, returning what we have 30529 1726882641.79335: results queue empty 30529 1726882641.79337: checking for any_errors_fatal 30529 1726882641.79346: done checking for any_errors_fatal 30529 1726882641.79347: checking for max_fail_percentage 30529 1726882641.79349: done checking for max_fail_percentage 30529 1726882641.79350: checking to see if all hosts have failed and the running result is not ok 30529 1726882641.79351: done checking to see if all hosts have failed 30529 1726882641.79352: getting the remaining hosts for this loop 30529 1726882641.79354: done getting the remaining hosts for this loop 30529 1726882641.79358: getting the next task for host managed_node1 30529 1726882641.79368: done getting next task for host managed_node1 30529 1726882641.79486: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882641.79494: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882641.79509: getting variables 30529 1726882641.79511: in VariableManager get_vars() 30529 1726882641.79551: Calling all_inventory to load vars for managed_node1 30529 1726882641.79554: Calling groups_inventory to load vars for managed_node1 30529 1726882641.79557: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882641.79568: Calling all_plugins_play to load vars for managed_node1 30529 1726882641.79572: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882641.79575: Calling groups_plugins_play to load vars for managed_node1 30529 1726882641.81680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.83504: done with get_vars() 30529 1726882641.83531: done getting variables 30529 1726882641.83601: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:21 -0400 (0:00:00.074) 0:00:55.862 ****** 30529 1726882641.83645: entering _queue_task() for managed_node1/fail 30529 1726882641.84121: worker is 1 (out of 1 available) 30529 1726882641.84132: exiting _queue_task() for managed_node1/fail 30529 1726882641.84144: done queuing things up, now waiting for results queue to drain 30529 1726882641.84146: waiting for pending results... 30529 1726882641.84447: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882641.84545: in run() - task 12673a56-9f93-b0f1-edc0-00000000127c 30529 1726882641.84549: variable 'ansible_search_path' from source: unknown 30529 1726882641.84552: variable 'ansible_search_path' from source: unknown 30529 1726882641.84579: calling self._execute() 30529 1726882641.84685: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882641.84699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.84725: variable 'omit' from source: magic vars 30529 1726882641.85471: variable 'ansible_distribution_major_version' from source: facts 30529 1726882641.85580: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882641.85620: variable 'network_state' from source: role '' defaults 30529 1726882641.85702: Evaluated conditional (network_state != {}): False 30529 1726882641.85712: when evaluation is False, skipping this task 30529 1726882641.85718: _execute() done 30529 1726882641.85723: dumping result to json 30529 1726882641.85728: done dumping result, returning 30529 1726882641.85736: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-00000000127c] 30529 1726882641.85744: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882641.85881: no more pending results, returning what we have 30529 1726882641.85886: results queue empty 30529 1726882641.85887: checking for any_errors_fatal 30529 1726882641.85895: done checking for any_errors_fatal 30529 1726882641.85896: checking for max_fail_percentage 30529 1726882641.85898: done checking for max_fail_percentage 30529 1726882641.85899: checking to see if all hosts have failed and the running result is not ok 30529 1726882641.85900: done checking to see if all hosts have failed 30529 1726882641.85901: getting the remaining hosts for this loop 30529 1726882641.85903: done getting the remaining hosts for this loop 30529 1726882641.85907: getting the next task for host managed_node1 30529 1726882641.85918: done getting next task for host managed_node1 30529 1726882641.85923: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882641.85929: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882641.85958: getting variables 30529 1726882641.85960: in VariableManager get_vars() 30529 1726882641.86222: Calling all_inventory to load vars for managed_node1 30529 1726882641.86225: Calling groups_inventory to load vars for managed_node1 30529 1726882641.86228: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882641.86241: Calling all_plugins_play to load vars for managed_node1 30529 1726882641.86245: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882641.86248: Calling groups_plugins_play to load vars for managed_node1 30529 1726882641.86834: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127c 30529 1726882641.86838: WORKER PROCESS EXITING 30529 1726882641.88168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.89916: done with get_vars() 30529 1726882641.89939: done getting variables 30529 1726882641.89999: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:21 -0400 (0:00:00.063) 0:00:55.926 ****** 30529 1726882641.90042: entering _queue_task() for managed_node1/fail 30529 1726882641.90483: worker is 1 (out of 1 available) 30529 1726882641.90497: exiting _queue_task() for managed_node1/fail 30529 1726882641.90510: done queuing things up, now waiting for results queue to drain 30529 1726882641.90512: waiting for pending results... 30529 1726882641.90859: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882641.91026: in run() - task 12673a56-9f93-b0f1-edc0-00000000127d 30529 1726882641.91049: variable 'ansible_search_path' from source: unknown 30529 1726882641.91058: variable 'ansible_search_path' from source: unknown 30529 1726882641.91099: calling self._execute() 30529 1726882641.91201: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882641.91221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.91237: variable 'omit' from source: magic vars 30529 1726882641.91650: variable 'ansible_distribution_major_version' from source: facts 30529 1726882641.91654: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882641.91774: variable 'network_state' from source: role '' defaults 30529 1726882641.91790: Evaluated conditional (network_state != {}): False 30529 1726882641.91801: when evaluation is False, skipping this task 30529 1726882641.91867: _execute() done 30529 1726882641.91870: dumping result to json 30529 1726882641.91873: done dumping result, returning 30529 1726882641.91876: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-00000000127d] 30529 1726882641.91878: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127d 30529 1726882641.91953: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127d 30529 1726882641.91957: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882641.92023: no more pending results, returning what we have 30529 1726882641.92027: results queue empty 30529 1726882641.92028: checking for any_errors_fatal 30529 1726882641.92036: done checking for any_errors_fatal 30529 1726882641.92037: checking for max_fail_percentage 30529 1726882641.92039: done checking for max_fail_percentage 30529 1726882641.92040: checking to see if all hosts have failed and the running result is not ok 30529 1726882641.92041: done checking to see if all hosts have failed 30529 1726882641.92042: getting the remaining hosts for this loop 30529 1726882641.92044: done getting the remaining hosts for this loop 30529 1726882641.92047: getting the next task for host managed_node1 30529 1726882641.92057: done getting next task for host managed_node1 30529 1726882641.92061: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882641.92066: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882641.92310: getting variables 30529 1726882641.92312: in VariableManager get_vars() 30529 1726882641.92348: Calling all_inventory to load vars for managed_node1 30529 1726882641.92350: Calling groups_inventory to load vars for managed_node1 30529 1726882641.92353: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882641.92362: Calling all_plugins_play to load vars for managed_node1 30529 1726882641.92366: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882641.92369: Calling groups_plugins_play to load vars for managed_node1 30529 1726882641.93949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882641.95502: done with get_vars() 30529 1726882641.95523: done getting variables 30529 1726882641.95588: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:37:21 -0400 (0:00:00.055) 0:00:55.982 ****** 30529 1726882641.95627: entering _queue_task() for managed_node1/fail 30529 1726882641.95957: worker is 1 (out of 1 available) 30529 1726882641.95970: exiting _queue_task() for managed_node1/fail 30529 1726882641.96099: done queuing things up, now waiting for results queue to drain 30529 1726882641.96101: waiting for pending results... 30529 1726882641.96337: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882641.96536: in run() - task 12673a56-9f93-b0f1-edc0-00000000127e 30529 1726882641.96541: variable 'ansible_search_path' from source: unknown 30529 1726882641.96544: variable 'ansible_search_path' from source: unknown 30529 1726882641.96547: calling self._execute() 30529 1726882641.96629: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882641.96646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882641.96665: variable 'omit' from source: magic vars 30529 1726882641.97053: variable 'ansible_distribution_major_version' from source: facts 30529 1726882641.97078: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882641.97299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882641.99564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882641.99652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882641.99702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882641.99742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 
1726882641.99801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882641.99865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882641.99907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882641.99941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.00019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.00025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.00116: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.00147: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882642.00351: variable 'ansible_distribution' from source: facts 30529 1726882642.00355: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.00357: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882642.00573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.00611: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.00640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.00694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.00716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.00767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.00807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.00896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.00901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.00903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882642.00949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.00978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.01019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.01108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.01111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.01420: variable 'network_connections' from source: include params 30529 1726882642.01442: variable 'interface' from source: play vars 30529 1726882642.01515: variable 'interface' from source: play vars 30529 1726882642.01531: variable 'network_state' from source: role '' defaults 30529 1726882642.01611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.01806: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.01848: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.01974: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.01978: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.01996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.02025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.02065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.02106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.02136: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882642.02145: when evaluation is False, skipping this task 30529 1726882642.02153: _execute() done 30529 1726882642.02160: dumping result to json 30529 1726882642.02168: done dumping result, returning 30529 1726882642.02181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-00000000127e] 30529 1726882642.02190: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127e 30529 1726882642.02382: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127e 30529 1726882642.02386: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882642.02460: no more pending results, returning what we have 30529 1726882642.02465: results queue empty 30529 1726882642.02466: checking for any_errors_fatal 30529 1726882642.02474: done checking for any_errors_fatal 30529 1726882642.02475: checking for max_fail_percentage 30529 1726882642.02477: done checking for max_fail_percentage 30529 1726882642.02478: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.02479: done checking to see if all hosts have failed 30529 1726882642.02480: getting the remaining hosts for this loop 30529 1726882642.02482: done getting the remaining hosts for this loop 30529 1726882642.02487: getting the next task for host managed_node1 30529 1726882642.02498: done getting next task for host managed_node1 30529 1726882642.02502: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882642.02507: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882642.02532: getting variables 30529 1726882642.02535: in VariableManager get_vars() 30529 1726882642.02575: Calling all_inventory to load vars for managed_node1 30529 1726882642.02578: Calling groups_inventory to load vars for managed_node1 30529 1726882642.02580: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.02591: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.02810: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.02815: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.04240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.05366: done with get_vars() 30529 1726882642.05382: done getting variables 30529 1726882642.05428: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:22 -0400 (0:00:00.098) 0:00:56.080 ****** 30529 1726882642.05453: entering _queue_task() for managed_node1/dnf 30529 1726882642.05708: worker is 1 (out of 1 available) 30529 1726882642.05722: exiting _queue_task() for managed_node1/dnf 30529 1726882642.05735: done queuing things up, now waiting for results queue to drain 30529 1726882642.05737: waiting for pending results... 30529 1726882642.05923: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882642.06022: in run() - task 12673a56-9f93-b0f1-edc0-00000000127f 30529 1726882642.06035: variable 'ansible_search_path' from source: unknown 30529 1726882642.06039: variable 'ansible_search_path' from source: unknown 30529 1726882642.06067: calling self._execute() 30529 1726882642.06144: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.06148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.06156: variable 'omit' from source: magic vars 30529 1726882642.06539: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.06543: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.06820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.08381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.08436: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.08464: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882642.08491: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.08511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.08570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.08592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.08610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.08650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.08664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.08744: variable 'ansible_distribution' from source: facts 30529 1726882642.08748: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.08760: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882642.08837: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.08919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882642.08935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.08953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.08977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.08994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.09021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.09037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.09053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.09076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.09086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.09119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.09135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.09151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.09174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.09184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.09291: variable 'network_connections' from source: include params 30529 1726882642.09321: variable 'interface' from source: play vars 30529 1726882642.09353: variable 'interface' from source: play vars 30529 1726882642.09483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.09719: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.09722: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.09724: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.09726: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.09728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.09747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.09773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.09800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.09844: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882642.10170: variable 'network_connections' from source: include params 30529 1726882642.10173: variable 'interface' from source: play vars 30529 1726882642.10176: variable 'interface' from source: play vars 30529 1726882642.10178: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882642.10181: when evaluation is False, skipping this task 30529 1726882642.10183: _execute() done 30529 1726882642.10185: dumping result to json 30529 1726882642.10187: done dumping result, returning 30529 1726882642.10301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000127f] 30529 1726882642.10304: sending task result for task 12673a56-9f93-b0f1-edc0-00000000127f 30529 1726882642.10367: 
done sending task result for task 12673a56-9f93-b0f1-edc0-00000000127f 30529 1726882642.10370: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882642.10450: no more pending results, returning what we have 30529 1726882642.10454: results queue empty 30529 1726882642.10455: checking for any_errors_fatal 30529 1726882642.10460: done checking for any_errors_fatal 30529 1726882642.10461: checking for max_fail_percentage 30529 1726882642.10462: done checking for max_fail_percentage 30529 1726882642.10463: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.10464: done checking to see if all hosts have failed 30529 1726882642.10464: getting the remaining hosts for this loop 30529 1726882642.10466: done getting the remaining hosts for this loop 30529 1726882642.10469: getting the next task for host managed_node1 30529 1726882642.10476: done getting next task for host managed_node1 30529 1726882642.10479: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882642.10484: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882642.10614: getting variables 30529 1726882642.10616: in VariableManager get_vars() 30529 1726882642.10709: Calling all_inventory to load vars for managed_node1 30529 1726882642.10712: Calling groups_inventory to load vars for managed_node1 30529 1726882642.10715: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.10723: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.10725: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.10728: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.12434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.14297: done with get_vars() 30529 1726882642.14323: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882642.14404: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:22 -0400 (0:00:00.089) 0:00:56.170 ****** 30529 1726882642.14437: entering _queue_task() for managed_node1/yum 30529 1726882642.15180: worker is 1 (out of 1 available) 30529 1726882642.15199: exiting _queue_task() for managed_node1/yum 30529 1726882642.15213: done queuing things up, now waiting for results queue to drain 30529 1726882642.15215: waiting for pending results... 30529 1726882642.15702: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882642.15805: in run() - task 12673a56-9f93-b0f1-edc0-000000001280 30529 1726882642.15904: variable 'ansible_search_path' from source: unknown 30529 1726882642.15909: variable 'ansible_search_path' from source: unknown 30529 1726882642.15913: calling self._execute() 30529 1726882642.16010: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.16014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.16018: variable 'omit' from source: magic vars 30529 1726882642.16356: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.16374: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.16573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.20471: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.20690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.20808: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882642.20845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.20908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.21065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.21094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.21198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.21201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.21217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.21523: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.21539: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882642.21543: when evaluation is False, skipping this task 30529 1726882642.21545: _execute() done 30529 1726882642.21548: dumping result to json 30529 1726882642.21550: done dumping result, returning 30529 1726882642.21563: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001280] 30529 1726882642.21566: sending task result for task 12673a56-9f93-b0f1-edc0-000000001280 30529 1726882642.21673: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001280 30529 1726882642.21677: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882642.21735: no more pending results, returning what we have 30529 1726882642.21739: results queue empty 30529 1726882642.21740: checking for any_errors_fatal 30529 1726882642.21750: done checking for any_errors_fatal 30529 1726882642.21750: checking for max_fail_percentage 30529 1726882642.21752: done checking for max_fail_percentage 30529 1726882642.21753: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.21754: done checking to see if all hosts have failed 30529 1726882642.21755: getting the remaining hosts for this loop 30529 1726882642.21757: done getting the remaining hosts for this loop 30529 1726882642.21761: getting the next task for host managed_node1 30529 1726882642.21770: done getting next task for host managed_node1 30529 1726882642.21773: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882642.21778: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882642.21809: getting variables 30529 1726882642.21812: in VariableManager get_vars() 30529 1726882642.21853: Calling all_inventory to load vars for managed_node1 30529 1726882642.21856: Calling groups_inventory to load vars for managed_node1 30529 1726882642.21859: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.21869: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.21873: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.21876: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.25322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.38956: done with get_vars() 30529 1726882642.38984: done getting variables 30529 1726882642.39143: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:22 -0400 (0:00:00.247) 0:00:56.417 ****** 30529 1726882642.39174: entering _queue_task() for managed_node1/fail 30529 1726882642.40008: worker is 1 (out of 1 available) 30529 1726882642.40023: exiting _queue_task() for managed_node1/fail 30529 1726882642.40036: done queuing things up, now waiting for results queue to drain 30529 1726882642.40038: waiting for pending results... 30529 1726882642.40619: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882642.40777: in run() - task 12673a56-9f93-b0f1-edc0-000000001281 30529 1726882642.40789: variable 'ansible_search_path' from source: unknown 30529 1726882642.40819: variable 'ansible_search_path' from source: unknown 30529 1726882642.40833: calling self._execute() 30529 1726882642.40929: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.40940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.40943: variable 'omit' from source: magic vars 30529 1726882642.42099: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.42103: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.42162: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.42584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.46280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.46428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.46464: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882642.46502: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.46536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.46799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.46804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.46807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.46810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.46813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.46815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.46845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.47010: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.47013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.47016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.47018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.47021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.47023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.47033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.47049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.47460: variable 'network_connections' from source: include params 30529 1726882642.47511: variable 'interface' from source: play vars 30529 1726882642.47585: variable 'interface' from source: play vars 30529 1726882642.47825: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.48077: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.48427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.48498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.48698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.48701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.48703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.48706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.48708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.48710: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882642.49002: variable 'network_connections' from source: include params 30529 1726882642.49011: variable 'interface' from source: play vars 30529 1726882642.49073: variable 'interface' from source: play vars 30529 1726882642.49105: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882642.49114: when evaluation is False, skipping this task 30529 
1726882642.49121: _execute() done 30529 1726882642.49128: dumping result to json 30529 1726882642.49135: done dumping result, returning 30529 1726882642.49146: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001281] 30529 1726882642.49155: sending task result for task 12673a56-9f93-b0f1-edc0-000000001281 30529 1726882642.49279: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001281 30529 1726882642.49288: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882642.49343: no more pending results, returning what we have 30529 1726882642.49347: results queue empty 30529 1726882642.49348: checking for any_errors_fatal 30529 1726882642.49356: done checking for any_errors_fatal 30529 1726882642.49357: checking for max_fail_percentage 30529 1726882642.49358: done checking for max_fail_percentage 30529 1726882642.49359: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.49360: done checking to see if all hosts have failed 30529 1726882642.49361: getting the remaining hosts for this loop 30529 1726882642.49362: done getting the remaining hosts for this loop 30529 1726882642.49366: getting the next task for host managed_node1 30529 1726882642.49374: done getting next task for host managed_node1 30529 1726882642.49377: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882642.49382: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882642.49524: getting variables 30529 1726882642.49526: in VariableManager get_vars() 30529 1726882642.49571: Calling all_inventory to load vars for managed_node1 30529 1726882642.49574: Calling groups_inventory to load vars for managed_node1 30529 1726882642.49577: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.49587: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.49591: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.49816: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.51037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.51913: done with get_vars() 30529 1726882642.51931: done getting variables 30529 1726882642.51970: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:22 -0400 (0:00:00.128) 0:00:56.546 ****** 30529 1726882642.52001: entering _queue_task() for managed_node1/package 30529 1726882642.52229: worker is 1 (out of 1 available) 30529 1726882642.52242: exiting _queue_task() for managed_node1/package 30529 1726882642.52260: done queuing things up, now waiting for results queue to drain 30529 1726882642.52261: waiting for pending results... 30529 1726882642.52495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882642.52667: in run() - task 12673a56-9f93-b0f1-edc0-000000001282 30529 1726882642.52686: variable 'ansible_search_path' from source: unknown 30529 1726882642.52702: variable 'ansible_search_path' from source: unknown 30529 1726882642.52748: calling self._execute() 30529 1726882642.52862: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.52873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.52887: variable 'omit' from source: magic vars 30529 1726882642.53279: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.53290: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.53433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.53627: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.53660: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.53684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.53747: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.53830: variable 'network_packages' from source: role '' defaults 30529 1726882642.53902: variable '__network_provider_setup' from source: role '' defaults 30529 1726882642.53910: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882642.53957: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882642.53964: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882642.54010: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882642.54126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.55713: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.55752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.55781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882642.55808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.55827: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.55885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.55909: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.55927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.55952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.55962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.56002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.56018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.56035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.56058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.56068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882642.56209: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882642.56285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.56306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.56325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.56349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.56359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.56426: variable 'ansible_python' from source: facts 30529 1726882642.56435: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882642.56488: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882642.56546: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882642.56629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.56648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.56665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.56689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.56704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.56735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.56758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.56774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.56802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.56813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.56909: variable 'network_connections' from source: include params 
30529 1726882642.56915: variable 'interface' from source: play vars 30529 1726882642.56983: variable 'interface' from source: play vars 30529 1726882642.57035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.57054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.57075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.57102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.57138: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.57313: variable 'network_connections' from source: include params 30529 1726882642.57316: variable 'interface' from source: play vars 30529 1726882642.57382: variable 'interface' from source: play vars 30529 1726882642.57412: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882642.57462: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.57663: variable 'network_connections' from source: include params 30529 1726882642.57666: variable 'interface' from source: play vars 30529 1726882642.57714: variable 'interface' from source: play vars 30529 1726882642.57731: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882642.57783: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882642.57974: variable 'network_connections' 
from source: include params 30529 1726882642.57977: variable 'interface' from source: play vars 30529 1726882642.58027: variable 'interface' from source: play vars 30529 1726882642.58061: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882642.58107: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882642.58113: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882642.58153: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882642.58291: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882642.58575: variable 'network_connections' from source: include params 30529 1726882642.58578: variable 'interface' from source: play vars 30529 1726882642.58626: variable 'interface' from source: play vars 30529 1726882642.58632: variable 'ansible_distribution' from source: facts 30529 1726882642.58635: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.58640: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.58651: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882642.58756: variable 'ansible_distribution' from source: facts 30529 1726882642.58760: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.58762: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.58774: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882642.58882: variable 'ansible_distribution' from source: facts 30529 1726882642.58885: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.58890: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.58917: variable 'network_provider' from source: set_fact 30529 
1726882642.58929: variable 'ansible_facts' from source: unknown 30529 1726882642.59278: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882642.59283: when evaluation is False, skipping this task 30529 1726882642.59285: _execute() done 30529 1726882642.59288: dumping result to json 30529 1726882642.59290: done dumping result, returning 30529 1726882642.59300: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000001282] 30529 1726882642.59302: sending task result for task 12673a56-9f93-b0f1-edc0-000000001282 30529 1726882642.59394: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001282 30529 1726882642.59398: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882642.59446: no more pending results, returning what we have 30529 1726882642.59450: results queue empty 30529 1726882642.59451: checking for any_errors_fatal 30529 1726882642.59456: done checking for any_errors_fatal 30529 1726882642.59457: checking for max_fail_percentage 30529 1726882642.59459: done checking for max_fail_percentage 30529 1726882642.59460: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.59461: done checking to see if all hosts have failed 30529 1726882642.59461: getting the remaining hosts for this loop 30529 1726882642.59463: done getting the remaining hosts for this loop 30529 1726882642.59467: getting the next task for host managed_node1 30529 1726882642.59475: done getting next task for host managed_node1 30529 1726882642.59478: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882642.59483: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882642.59510: getting variables 30529 1726882642.59512: in VariableManager get_vars() 30529 1726882642.59554: Calling all_inventory to load vars for managed_node1 30529 1726882642.59556: Calling groups_inventory to load vars for managed_node1 30529 1726882642.59558: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.59567: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.59570: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.59572: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.60511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.61372: done with get_vars() 30529 1726882642.61388: done getting variables 30529 1726882642.61430: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:22 -0400 (0:00:00.094) 0:00:56.640 ****** 30529 1726882642.61457: entering _queue_task() for managed_node1/package 30529 1726882642.61690: worker is 1 (out of 1 available) 30529 1726882642.61706: exiting _queue_task() for managed_node1/package 30529 1726882642.61721: done queuing things up, now waiting for results queue to drain 30529 1726882642.61722: waiting for pending results... 
30529 1726882642.61895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882642.61998: in run() - task 12673a56-9f93-b0f1-edc0-000000001283 30529 1726882642.62012: variable 'ansible_search_path' from source: unknown 30529 1726882642.62017: variable 'ansible_search_path' from source: unknown 30529 1726882642.62044: calling self._execute() 30529 1726882642.62121: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.62124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.62133: variable 'omit' from source: magic vars 30529 1726882642.62406: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.62415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.62501: variable 'network_state' from source: role '' defaults 30529 1726882642.62507: Evaluated conditional (network_state != {}): False 30529 1726882642.62510: when evaluation is False, skipping this task 30529 1726882642.62513: _execute() done 30529 1726882642.62515: dumping result to json 30529 1726882642.62518: done dumping result, returning 30529 1726882642.62525: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001283] 30529 1726882642.62530: sending task result for task 12673a56-9f93-b0f1-edc0-000000001283 30529 1726882642.62618: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001283 30529 1726882642.62621: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882642.62666: no more pending results, returning what we have 30529 1726882642.62670: results queue empty 30529 1726882642.62671: checking 
for any_errors_fatal 30529 1726882642.62678: done checking for any_errors_fatal 30529 1726882642.62678: checking for max_fail_percentage 30529 1726882642.62680: done checking for max_fail_percentage 30529 1726882642.62681: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.62681: done checking to see if all hosts have failed 30529 1726882642.62682: getting the remaining hosts for this loop 30529 1726882642.62684: done getting the remaining hosts for this loop 30529 1726882642.62687: getting the next task for host managed_node1 30529 1726882642.62696: done getting next task for host managed_node1 30529 1726882642.62699: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882642.62704: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882642.62724: getting variables 30529 1726882642.62726: in VariableManager get_vars() 30529 1726882642.62759: Calling all_inventory to load vars for managed_node1 30529 1726882642.62761: Calling groups_inventory to load vars for managed_node1 30529 1726882642.62763: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.62772: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.62775: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.62777: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.63509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.64379: done with get_vars() 30529 1726882642.64396: done getting variables 30529 1726882642.64439: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:22 -0400 (0:00:00.030) 0:00:56.670 ****** 30529 1726882642.64462: entering _queue_task() for managed_node1/package 30529 1726882642.64664: worker is 1 (out of 1 available) 30529 1726882642.64678: exiting _queue_task() for managed_node1/package 30529 1726882642.64691: done queuing things up, now waiting for results queue to drain 30529 1726882642.64692: waiting for pending results... 
30529 1726882642.64862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882642.64946: in run() - task 12673a56-9f93-b0f1-edc0-000000001284 30529 1726882642.64959: variable 'ansible_search_path' from source: unknown 30529 1726882642.64963: variable 'ansible_search_path' from source: unknown 30529 1726882642.64989: calling self._execute() 30529 1726882642.65066: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.65069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.65078: variable 'omit' from source: magic vars 30529 1726882642.65342: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.65357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.65436: variable 'network_state' from source: role '' defaults 30529 1726882642.65445: Evaluated conditional (network_state != {}): False 30529 1726882642.65448: when evaluation is False, skipping this task 30529 1726882642.65451: _execute() done 30529 1726882642.65453: dumping result to json 30529 1726882642.65456: done dumping result, returning 30529 1726882642.65463: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001284] 30529 1726882642.65475: sending task result for task 12673a56-9f93-b0f1-edc0-000000001284 30529 1726882642.65564: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001284 30529 1726882642.65567: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882642.65614: no more pending results, returning what we have 30529 1726882642.65618: results queue empty 30529 1726882642.65619: checking for 
any_errors_fatal 30529 1726882642.65624: done checking for any_errors_fatal 30529 1726882642.65625: checking for max_fail_percentage 30529 1726882642.65627: done checking for max_fail_percentage 30529 1726882642.65628: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.65629: done checking to see if all hosts have failed 30529 1726882642.65629: getting the remaining hosts for this loop 30529 1726882642.65631: done getting the remaining hosts for this loop 30529 1726882642.65634: getting the next task for host managed_node1 30529 1726882642.65641: done getting next task for host managed_node1 30529 1726882642.65644: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882642.65648: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882642.65667: getting variables 30529 1726882642.65668: in VariableManager get_vars() 30529 1726882642.65701: Calling all_inventory to load vars for managed_node1 30529 1726882642.65704: Calling groups_inventory to load vars for managed_node1 30529 1726882642.65706: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.65714: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.65716: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.65719: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.66559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.67770: done with get_vars() 30529 1726882642.67792: done getting variables 30529 1726882642.67845: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:22 -0400 (0:00:00.034) 0:00:56.704 ****** 30529 1726882642.67880: entering _queue_task() for managed_node1/service 30529 1726882642.68085: worker is 1 (out of 1 available) 30529 1726882642.68104: exiting _queue_task() for managed_node1/service 30529 1726882642.68118: done queuing things up, now waiting for results queue to drain 30529 1726882642.68119: waiting for pending results... 
30529 1726882642.68301: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882642.68402: in run() - task 12673a56-9f93-b0f1-edc0-000000001285 30529 1726882642.68414: variable 'ansible_search_path' from source: unknown 30529 1726882642.68418: variable 'ansible_search_path' from source: unknown 30529 1726882642.68444: calling self._execute() 30529 1726882642.68515: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.68518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.68526: variable 'omit' from source: magic vars 30529 1726882642.68780: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.68787: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.68871: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.69001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.70897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.70901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.70929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882642.70963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.70996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.71082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882642.71127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.71153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.71196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.71218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.71340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.71369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.71430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.71456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.71476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.71524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.71563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.71592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.71650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.71671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.71875: variable 'network_connections' from source: include params 30529 1726882642.71895: variable 'interface' from source: play vars 30529 1726882642.71979: variable 'interface' from source: play vars 30529 1726882642.72061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.72253: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.72399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.72410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.72414: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.72453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.72481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.72630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.72635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.72638: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882642.72897: variable 'network_connections' from source: include params 30529 1726882642.72908: variable 'interface' from source: play vars 30529 1726882642.72975: variable 'interface' from source: play vars 30529 1726882642.73006: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882642.73014: when evaluation is False, skipping this task 30529 1726882642.73021: _execute() done 30529 1726882642.73026: dumping result to json 30529 1726882642.73033: done dumping result, returning 30529 1726882642.73043: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001285] 30529 1726882642.73068: sending task result for task 12673a56-9f93-b0f1-edc0-000000001285 skipping: [managed_node1] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882642.73251: no more pending results, returning what we have 30529 1726882642.73254: results queue empty 30529 1726882642.73256: checking for any_errors_fatal 30529 1726882642.73262: done checking for any_errors_fatal 30529 1726882642.73263: checking for max_fail_percentage 30529 1726882642.73264: done checking for max_fail_percentage 30529 1726882642.73265: checking to see if all hosts have failed and the running result is not ok 30529 1726882642.73266: done checking to see if all hosts have failed 30529 1726882642.73267: getting the remaining hosts for this loop 30529 1726882642.73269: done getting the remaining hosts for this loop 30529 1726882642.73272: getting the next task for host managed_node1 30529 1726882642.73288: done getting next task for host managed_node1 30529 1726882642.73390: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882642.73398: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882642.73423: getting variables 30529 1726882642.73425: in VariableManager get_vars() 30529 1726882642.73465: Calling all_inventory to load vars for managed_node1 30529 1726882642.73468: Calling groups_inventory to load vars for managed_node1 30529 1726882642.73470: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882642.73481: Calling all_plugins_play to load vars for managed_node1 30529 1726882642.73484: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882642.73487: Calling groups_plugins_play to load vars for managed_node1 30529 1726882642.74252: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001285 30529 1726882642.74256: WORKER PROCESS EXITING 30529 1726882642.75186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882642.76305: done with get_vars() 30529 1726882642.76321: done getting variables 30529 1726882642.76365: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:22 -0400 (0:00:00.085) 0:00:56.790 ****** 30529 1726882642.76391: entering _queue_task() for managed_node1/service 30529 1726882642.76637: worker is 1 (out of 1 available) 30529 1726882642.76652: exiting _queue_task() for managed_node1/service 30529 1726882642.76664: done queuing 
things up, now waiting for results queue to drain 30529 1726882642.76665: waiting for pending results... 30529 1726882642.76845: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882642.76940: in run() - task 12673a56-9f93-b0f1-edc0-000000001286 30529 1726882642.76953: variable 'ansible_search_path' from source: unknown 30529 1726882642.76957: variable 'ansible_search_path' from source: unknown 30529 1726882642.76984: calling self._execute() 30529 1726882642.77072: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.77075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.77084: variable 'omit' from source: magic vars 30529 1726882642.77502: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.77505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882642.77670: variable 'network_provider' from source: set_fact 30529 1726882642.77675: variable 'network_state' from source: role '' defaults 30529 1726882642.77684: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882642.77695: variable 'omit' from source: magic vars 30529 1726882642.77756: variable 'omit' from source: magic vars 30529 1726882642.77783: variable 'network_service_name' from source: role '' defaults 30529 1726882642.77847: variable 'network_service_name' from source: role '' defaults 30529 1726882642.77957: variable '__network_provider_setup' from source: role '' defaults 30529 1726882642.77973: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882642.78040: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882642.78048: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882642.78114: variable '__network_packages_default_nm' from source: role '' defaults 
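Every entry in this trace follows the same shape: the controller PID, a fractional epoch timestamp, and a message (`30529 1726882642.77505: Evaluated conditional (...): True`). When skimming a run this long it can help to split entries back into fields; the sketch below does that with a regular expression. The field layout is inferred from this trace itself, not from any documented stable format, so treat the pattern as an assumption.

```python
import re
from datetime import datetime, timezone

# Inferred entry shape for this -vvvv trace: "<pid> <epoch.frac>: <message>".
# This is a reading aid, not a documented Ansible log format.
ENTRY = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_entry(line: str):
    """Split one verbose-log entry into pid, UTC timestamp, and message."""
    m = ENTRY.match(line)
    if not m:
        return None  # e.g. continuation text from a wrapped entry
    return {
        "pid": int(m.group("pid")),
        "time": datetime.fromtimestamp(float(m.group("ts")), tz=timezone.utc),
        "message": m.group("msg"),
    }

entry = parse_entry(
    "30529 1726882642.77505: Evaluated conditional "
    "(ansible_distribution_major_version != '6'): True"
)
# entry["pid"] == 30529; entry["message"] begins "Evaluated conditional"
```

Filtering parsed entries on prefixes such as `Evaluated conditional` or `Loading FilterModule` makes it much easier to separate task-flow decisions from the bulk plugin-loading noise above.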
30529 1726882642.78268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882642.79704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882642.79752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882642.79780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882642.79810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882642.79832: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882642.79886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.79910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.79930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.79971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.79981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.80015: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.80060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.80073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.80116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.80125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.80330: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882642.80433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.80456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.80480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.80519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.80533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.80615: variable 'ansible_python' from source: facts 30529 1726882642.80630: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882642.80702: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882642.80775: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882642.80898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.80952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.80955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.80978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.80996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.81061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882642.81078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882642.81081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.81299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882642.81302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882642.81305: variable 'network_connections' from source: include params 30529 1726882642.81307: variable 'interface' from source: play vars 30529 1726882642.81345: variable 'interface' from source: play vars 30529 1726882642.81444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882642.81627: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882642.81675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882642.81720: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882642.81761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882642.81822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882642.81860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882642.81884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882642.81971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882642.81975: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.82321: variable 'network_connections' from source: include params 30529 1726882642.82324: variable 'interface' from source: play vars 30529 1726882642.82327: variable 'interface' from source: play vars 30529 1726882642.82342: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882642.82418: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882642.82692: variable 'network_connections' from source: include params 30529 1726882642.82697: variable 'interface' from source: play vars 30529 1726882642.82761: variable 'interface' from source: play vars 30529 1726882642.82782: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882642.82856: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882642.83134: variable 'network_connections' from source: include params 30529 1726882642.83138: variable 'interface' from source: play vars 30529 1726882642.83226: variable 'interface' from source: play vars 30529 1726882642.83255: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882642.83336: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882642.83339: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882642.83376: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882642.83574: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882642.84044: variable 'network_connections' from source: include params 30529 1726882642.84048: variable 'interface' from source: play vars 30529 1726882642.84125: variable 'interface' from source: play vars 30529 1726882642.84128: variable 'ansible_distribution' from source: facts 30529 1726882642.84131: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.84133: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.84192: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882642.84318: variable 'ansible_distribution' from source: facts 30529 1726882642.84321: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.84323: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.84325: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882642.84603: variable 'ansible_distribution' from source: facts 30529 1726882642.84607: variable '__network_rh_distros' from source: role '' defaults 30529 1726882642.84609: variable 'ansible_distribution_major_version' from source: facts 30529 1726882642.84612: variable 'network_provider' from source: set_fact 30529 1726882642.84614: variable 'omit' from source: magic vars 30529 1726882642.84616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882642.84618: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882642.84620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882642.84626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882642.84635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882642.84662: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882642.84665: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.84669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882642.84765: Set connection var ansible_shell_executable to /bin/sh 30529 1726882642.84771: Set connection var ansible_pipelining to False 30529 1726882642.84773: Set connection var ansible_shell_type to sh 30529 1726882642.84783: Set connection var ansible_timeout to 10 30529 1726882642.84786: Set connection var ansible_connection to ssh 30529 1726882642.84795: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882642.84818: variable 'ansible_shell_executable' from source: unknown 30529 1726882642.84821: variable 'ansible_connection' from source: unknown 30529 1726882642.84824: variable 'ansible_module_compression' from source: unknown 30529 1726882642.84826: variable 'ansible_shell_type' from source: unknown 30529 1726882642.84828: variable 'ansible_shell_executable' from source: unknown 30529 1726882642.84830: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882642.84833: variable 'ansible_pipelining' from source: unknown 30529 1726882642.84835: variable 'ansible_timeout' from source: unknown 30529 1726882642.84837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882642.84954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882642.84961: variable 'omit' from source: magic vars 30529 1726882642.84964: starting attempt loop 30529 1726882642.84966: running the handler 30529 1726882642.85062: variable 'ansible_facts' from source: unknown 30529 1726882642.86354: _low_level_execute_command(): starting 30529 1726882642.86364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882642.87038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882642.87043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882642.87098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882642.87101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882642.87105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882642.87161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 30529 1726882642.87187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882642.87265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882642.89398: stdout chunk (state=3): >>>/root <<< 30529 1726882642.89402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882642.89404: stdout chunk (state=3): >>><<< 30529 1726882642.89406: stderr chunk (state=3): >>><<< 30529 1726882642.89410: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882642.89413: _low_level_execute_command(): starting 30529 1726882642.89415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588 `" && echo ansible-tmp-1726882642.8928504-33280-197501156994588="` echo /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588 `" ) && sleep 0' 30529 1726882642.90476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882642.90480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882642.90500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882642.90506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882642.90522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882642.90527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882642.90761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882642.90764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882642.90832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882642.92716: stdout chunk (state=3): 
>>>ansible-tmp-1726882642.8928504-33280-197501156994588=/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588 <<< 30529 1726882642.93065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882642.93078: stderr chunk (state=3): >>><<< 30529 1726882642.93081: stdout chunk (state=3): >>><<< 30529 1726882642.93112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882642.8928504-33280-197501156994588=/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882642.93143: variable 'ansible_module_compression' from source: unknown 30529 1726882642.93309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882642.93371: variable 'ansible_facts' 
from source: unknown 30529 1726882642.94003: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py 30529 1726882642.94066: Sending initial data 30529 1726882642.94069: Sent initial data (156 bytes) 30529 1726882642.94659: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882642.94669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882642.94679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882642.94697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882642.94715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882642.94723: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882642.94733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882642.94747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882642.94870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882642.94874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882642.94876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882642.94878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 
1726882642.94980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882642.96508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882642.96546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882642.96597: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp31kwrzh_ /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py <<< 30529 1726882642.96601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py" <<< 30529 1726882642.97002: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp31kwrzh_" to remote "/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py" <<< 30529 1726882642.98222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882642.98226: stdout chunk (state=3): >>><<< 30529 1726882642.98235: stderr chunk (state=3): >>><<< 30529 1726882642.98287: done transferring module to remote 30529 1726882642.98312: _low_level_execute_command(): starting 30529 1726882642.98319: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/ /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py && sleep 0' 30529 1726882642.99032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882642.99065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882642.99084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882642.99276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882642.99426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882642.99480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.01217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882643.01220: stdout chunk (state=3): >>><<< 30529 1726882643.01227: stderr chunk (state=3): >>><<< 30529 1726882643.01241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882643.01245: _low_level_execute_command(): starting 30529 1726882643.01249: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/AnsiballZ_systemd.py && sleep 0' 30529 1726882643.01911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882643.01925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882643.01937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.01953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882643.01969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882643.01983: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882643.02003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.02022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882643.02033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882643.02043: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882643.02058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882643.02107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.02154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.02169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882643.02195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.02275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.30903: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": 
"0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10895360", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300040704", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1810867000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882643.30941: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882643.32766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882643.32775: stdout chunk (state=3): >>><<< 30529 1726882643.32787: stderr chunk (state=3): >>><<< 30529 1726882643.32817: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10895360", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300040704", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1810867000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882643.33147: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882643.33151: _low_level_execute_command(): starting 30529 1726882643.33154: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882642.8928504-33280-197501156994588/ > /dev/null 2>&1 && sleep 0' 30529 1726882643.33745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882643.33759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882643.33774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.33797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882643.33816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882643.33831: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882643.33908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.33954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.33971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882643.33995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.34066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.35897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882643.35900: stdout chunk (state=3): >>><<< 30529 1726882643.35903: stderr chunk (state=3): >>><<< 30529 1726882643.35906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882643.35913: handler run complete 30529 1726882643.35955: attempt loop complete, returning result 30529 1726882643.35958: _execute() done 30529 1726882643.35961: dumping result to json 30529 1726882643.35972: done dumping result, returning 30529 1726882643.35981: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000001286] 30529 1726882643.35983: sending task result for task 12673a56-9f93-b0f1-edc0-000000001286 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882643.36292: no more pending results, returning what we have 30529 1726882643.36297: results queue empty 30529 1726882643.36299: checking for any_errors_fatal 30529 1726882643.36310: done checking for any_errors_fatal 30529 1726882643.36311: checking for max_fail_percentage 30529 1726882643.36312: done checking for max_fail_percentage 30529 1726882643.36313: checking to see if all hosts have failed and the running result is not ok 30529 1726882643.36314: done checking to see if all hosts have failed 30529 1726882643.36315: getting the remaining hosts for this loop 30529 1726882643.36316: done getting the remaining hosts for this loop 30529 1726882643.36320: getting the next task for host managed_node1 30529 1726882643.36327: done getting next task for host managed_node1 30529 1726882643.36330: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882643.36335: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882643.36347: getting variables 30529 1726882643.36349: in VariableManager get_vars() 30529 1726882643.36381: Calling all_inventory to load vars for managed_node1 30529 1726882643.36384: Calling groups_inventory to load vars for managed_node1 30529 1726882643.36386: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882643.36399: Calling all_plugins_play to load vars for managed_node1 30529 1726882643.36402: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882643.36407: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001286 30529 1726882643.36411: WORKER PROCESS EXITING 30529 1726882643.36419: Calling groups_plugins_play to load vars for managed_node1 30529 1726882643.37211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882643.38607: done with get_vars() 30529 1726882643.38627: done getting variables 30529 1726882643.38685: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:23 -0400 (0:00:00.623) 0:00:57.413 ****** 30529 1726882643.38726: entering _queue_task() for managed_node1/service 30529 1726882643.39034: worker is 1 (out of 1 available) 30529 1726882643.39046: exiting _queue_task() for managed_node1/service 30529 1726882643.39057: done queuing things up, now waiting for results queue to drain 30529 1726882643.39059: waiting for pending results... 
30529 1726882643.39469: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882643.39527: in run() - task 12673a56-9f93-b0f1-edc0-000000001287 30529 1726882643.39531: variable 'ansible_search_path' from source: unknown 30529 1726882643.39535: variable 'ansible_search_path' from source: unknown 30529 1726882643.39546: calling self._execute() 30529 1726882643.39624: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.39627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.39643: variable 'omit' from source: magic vars 30529 1726882643.39919: variable 'ansible_distribution_major_version' from source: facts 30529 1726882643.39928: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882643.40011: variable 'network_provider' from source: set_fact 30529 1726882643.40015: Evaluated conditional (network_provider == "nm"): True 30529 1726882643.40083: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882643.40153: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882643.40267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882643.42203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882643.42243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882643.42273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882643.42300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882643.42322: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882643.42383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882643.42414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882643.42432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882643.42458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882643.42468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882643.42508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882643.42526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882643.42543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882643.42567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882643.42578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882643.42611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882643.42627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882643.42645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882643.42669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882643.42679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882643.42775: variable 'network_connections' from source: include params 30529 1726882643.42784: variable 'interface' from source: play vars 30529 1726882643.42835: variable 'interface' from source: play vars 30529 1726882643.42882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882643.43007: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882643.43036: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882643.43059: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882643.43081: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882643.43115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882643.43130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882643.43150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882643.43167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882643.43207: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882643.43359: variable 'network_connections' from source: include params 30529 1726882643.43365: variable 'interface' from source: play vars 30529 1726882643.43408: variable 'interface' from source: play vars 30529 1726882643.43429: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882643.43432: when evaluation is False, skipping this task 30529 1726882643.43435: _execute() done 30529 1726882643.43437: dumping result to json 30529 1726882643.43439: done dumping result, returning 30529 1726882643.43446: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000001287] 30529 
1726882643.43456: sending task result for task 12673a56-9f93-b0f1-edc0-000000001287 30529 1726882643.43537: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001287 30529 1726882643.43540: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882643.43584: no more pending results, returning what we have 30529 1726882643.43589: results queue empty 30529 1726882643.43590: checking for any_errors_fatal 30529 1726882643.43611: done checking for any_errors_fatal 30529 1726882643.43612: checking for max_fail_percentage 30529 1726882643.43614: done checking for max_fail_percentage 30529 1726882643.43615: checking to see if all hosts have failed and the running result is not ok 30529 1726882643.43616: done checking to see if all hosts have failed 30529 1726882643.43617: getting the remaining hosts for this loop 30529 1726882643.43619: done getting the remaining hosts for this loop 30529 1726882643.43622: getting the next task for host managed_node1 30529 1726882643.43630: done getting next task for host managed_node1 30529 1726882643.43634: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882643.43638: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882643.43659: getting variables 30529 1726882643.43661: in VariableManager get_vars() 30529 1726882643.43703: Calling all_inventory to load vars for managed_node1 30529 1726882643.43706: Calling groups_inventory to load vars for managed_node1 30529 1726882643.43708: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882643.43717: Calling all_plugins_play to load vars for managed_node1 30529 1726882643.43720: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882643.43722: Calling groups_plugins_play to load vars for managed_node1 30529 1726882643.44624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882643.45480: done with get_vars() 30529 1726882643.45497: done getting variables 30529 1726882643.45540: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:23 -0400 (0:00:00.068) 0:00:57.481 
****** 30529 1726882643.45564: entering _queue_task() for managed_node1/service 30529 1726882643.45799: worker is 1 (out of 1 available) 30529 1726882643.45814: exiting _queue_task() for managed_node1/service 30529 1726882643.45830: done queuing things up, now waiting for results queue to drain 30529 1726882643.45832: waiting for pending results... 30529 1726882643.46008: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882643.46107: in run() - task 12673a56-9f93-b0f1-edc0-000000001288 30529 1726882643.46118: variable 'ansible_search_path' from source: unknown 30529 1726882643.46122: variable 'ansible_search_path' from source: unknown 30529 1726882643.46149: calling self._execute() 30529 1726882643.46227: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.46231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.46240: variable 'omit' from source: magic vars 30529 1726882643.46517: variable 'ansible_distribution_major_version' from source: facts 30529 1726882643.46527: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882643.46609: variable 'network_provider' from source: set_fact 30529 1726882643.46614: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882643.46617: when evaluation is False, skipping this task 30529 1726882643.46619: _execute() done 30529 1726882643.46622: dumping result to json 30529 1726882643.46624: done dumping result, returning 30529 1726882643.46632: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000001288] 30529 1726882643.46636: sending task result for task 12673a56-9f93-b0f1-edc0-000000001288 30529 1726882643.46724: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001288 30529 1726882643.46728: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882643.46769: no more pending results, returning what we have 30529 1726882643.46772: results queue empty 30529 1726882643.46773: checking for any_errors_fatal 30529 1726882643.46783: done checking for any_errors_fatal 30529 1726882643.46783: checking for max_fail_percentage 30529 1726882643.46785: done checking for max_fail_percentage 30529 1726882643.46786: checking to see if all hosts have failed and the running result is not ok 30529 1726882643.46786: done checking to see if all hosts have failed 30529 1726882643.46787: getting the remaining hosts for this loop 30529 1726882643.46789: done getting the remaining hosts for this loop 30529 1726882643.46792: getting the next task for host managed_node1 30529 1726882643.46802: done getting next task for host managed_node1 30529 1726882643.46806: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882643.46811: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882643.46835: getting variables 30529 1726882643.46836: in VariableManager get_vars() 30529 1726882643.46869: Calling all_inventory to load vars for managed_node1 30529 1726882643.46871: Calling groups_inventory to load vars for managed_node1 30529 1726882643.46873: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882643.46884: Calling all_plugins_play to load vars for managed_node1 30529 1726882643.46886: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882643.46889: Calling groups_plugins_play to load vars for managed_node1 30529 1726882643.47651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882643.48516: done with get_vars() 30529 1726882643.48532: done getting variables 30529 1726882643.48573: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:23 -0400 (0:00:00.030) 0:00:57.512 ****** 30529 1726882643.48601: entering _queue_task() for managed_node1/copy 30529 1726882643.48838: worker is 1 (out of 1 available) 30529 1726882643.48853: exiting _queue_task() for managed_node1/copy 30529 1726882643.48866: done queuing things up, now waiting for results queue to drain 30529 1726882643.48867: waiting for 
pending results... 30529 1726882643.49051: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882643.49149: in run() - task 12673a56-9f93-b0f1-edc0-000000001289 30529 1726882643.49161: variable 'ansible_search_path' from source: unknown 30529 1726882643.49164: variable 'ansible_search_path' from source: unknown 30529 1726882643.49191: calling self._execute() 30529 1726882643.49268: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.49271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.49280: variable 'omit' from source: magic vars 30529 1726882643.49553: variable 'ansible_distribution_major_version' from source: facts 30529 1726882643.49563: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882643.49646: variable 'network_provider' from source: set_fact 30529 1726882643.49650: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882643.49653: when evaluation is False, skipping this task 30529 1726882643.49656: _execute() done 30529 1726882643.49658: dumping result to json 30529 1726882643.49661: done dumping result, returning 30529 1726882643.49670: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000001289] 30529 1726882643.49673: sending task result for task 12673a56-9f93-b0f1-edc0-000000001289 30529 1726882643.49768: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001289 30529 1726882643.49770: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882643.49819: no more pending results, returning what we have 30529 1726882643.49823: results queue empty 30529 
1726882643.49824: checking for any_errors_fatal 30529 1726882643.49829: done checking for any_errors_fatal 30529 1726882643.49830: checking for max_fail_percentage 30529 1726882643.49832: done checking for max_fail_percentage 30529 1726882643.49832: checking to see if all hosts have failed and the running result is not ok 30529 1726882643.49833: done checking to see if all hosts have failed 30529 1726882643.49834: getting the remaining hosts for this loop 30529 1726882643.49835: done getting the remaining hosts for this loop 30529 1726882643.49839: getting the next task for host managed_node1 30529 1726882643.49846: done getting next task for host managed_node1 30529 1726882643.49850: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882643.49854: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882643.49874: getting variables 30529 1726882643.49876: in VariableManager get_vars() 30529 1726882643.49910: Calling all_inventory to load vars for managed_node1 30529 1726882643.49912: Calling groups_inventory to load vars for managed_node1 30529 1726882643.49914: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882643.49923: Calling all_plugins_play to load vars for managed_node1 30529 1726882643.49926: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882643.49928: Calling groups_plugins_play to load vars for managed_node1 30529 1726882643.51224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882643.52730: done with get_vars() 30529 1726882643.52753: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:23 -0400 (0:00:00.042) 0:00:57.554 ****** 30529 1726882643.52848: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882643.53115: worker is 1 (out of 1 available) 30529 1726882643.53129: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882643.53144: done queuing things up, now waiting for results queue to drain 30529 1726882643.53146: waiting for pending results... 
30529 1726882643.53520: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882643.53583: in run() - task 12673a56-9f93-b0f1-edc0-00000000128a 30529 1726882643.53617: variable 'ansible_search_path' from source: unknown 30529 1726882643.53620: variable 'ansible_search_path' from source: unknown 30529 1726882643.53725: calling self._execute() 30529 1726882643.53786: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.53802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.53819: variable 'omit' from source: magic vars 30529 1726882643.54219: variable 'ansible_distribution_major_version' from source: facts 30529 1726882643.54239: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882643.54252: variable 'omit' from source: magic vars 30529 1726882643.54330: variable 'omit' from source: magic vars 30529 1726882643.54508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882643.56079: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882643.56128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882643.56154: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882643.56181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882643.56203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882643.56262: variable 'network_provider' from source: set_fact 30529 1726882643.56353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882643.56372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882643.56391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882643.56426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882643.56440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882643.56491: variable 'omit' from source: magic vars 30529 1726882643.56567: variable 'omit' from source: magic vars 30529 1726882643.56644: variable 'network_connections' from source: include params 30529 1726882643.56657: variable 'interface' from source: play vars 30529 1726882643.56704: variable 'interface' from source: play vars 30529 1726882643.56806: variable 'omit' from source: magic vars 30529 1726882643.56813: variable '__lsr_ansible_managed' from source: task vars 30529 1726882643.56857: variable '__lsr_ansible_managed' from source: task vars 30529 1726882643.56988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882643.57125: Loaded config def from plugin (lookup/template) 30529 1726882643.57128: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882643.57148: File lookup term: get_ansible_managed.j2 30529 1726882643.57151: variable 
'ansible_search_path' from source: unknown 30529 1726882643.57154: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882643.57166: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882643.57180: variable 'ansible_search_path' from source: unknown 30529 1726882643.60484: variable 'ansible_managed' from source: unknown 30529 1726882643.60556: variable 'omit' from source: magic vars 30529 1726882643.60575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882643.60597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882643.60609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882643.60622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882643.60631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882643.60656: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882643.60659: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.60662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.60725: Set connection var ansible_shell_executable to /bin/sh 30529 1726882643.60729: Set connection var ansible_pipelining to False 30529 1726882643.60731: Set connection var ansible_shell_type to sh 30529 1726882643.60739: Set connection var ansible_timeout to 10 30529 1726882643.60741: Set connection var ansible_connection to ssh 30529 1726882643.60745: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882643.60764: variable 'ansible_shell_executable' from source: unknown 30529 1726882643.60767: variable 'ansible_connection' from source: unknown 30529 1726882643.60769: variable 'ansible_module_compression' from source: unknown 30529 1726882643.60772: variable 'ansible_shell_type' from source: unknown 30529 1726882643.60774: variable 'ansible_shell_executable' from source: unknown 30529 1726882643.60777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882643.60779: variable 'ansible_pipelining' from source: unknown 30529 1726882643.60781: variable 'ansible_timeout' from source: unknown 30529 1726882643.60786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882643.60872: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882643.60883: variable 'omit' from 
source: magic vars 30529 1726882643.60886: starting attempt loop 30529 1726882643.60891: running the handler 30529 1726882643.60903: _low_level_execute_command(): starting 30529 1726882643.60909: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882643.61395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.61399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.61403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.61405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.61459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.61462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882643.61464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.61522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.63170: stdout chunk (state=3): >>>/root <<< 30529 1726882643.63270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 
1726882643.63301: stderr chunk (state=3): >>><<< 30529 1726882643.63304: stdout chunk (state=3): >>><<< 30529 1726882643.63321: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882643.63330: _low_level_execute_command(): starting 30529 1726882643.63335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377 `" && echo ansible-tmp-1726882643.633208-33311-234596860203377="` echo /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377 `" ) && sleep 0' 30529 1726882643.63738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30529 1726882643.63742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.63754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.63811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.63815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.63861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.65722: stdout chunk (state=3): >>>ansible-tmp-1726882643.633208-33311-234596860203377=/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377 <<< 30529 1726882643.65832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882643.65856: stderr chunk (state=3): >>><<< 30529 1726882643.65859: stdout chunk (state=3): >>><<< 30529 1726882643.65872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882643.633208-33311-234596860203377=/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882643.65905: variable 'ansible_module_compression' from source: unknown 30529 1726882643.65937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882643.65967: variable 'ansible_facts' from source: unknown 30529 1726882643.66031: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py 30529 1726882643.66126: Sending initial data 30529 1726882643.66129: Sent initial data (167 bytes) 30529 1726882643.66540: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.66544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.66558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.66620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.66622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.66658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.68161: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882643.68167: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882643.68205: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882643.68247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpflhro5cs /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py <<< 30529 1726882643.68254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py" <<< 30529 1726882643.68287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpflhro5cs" to remote "/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py" <<< 30529 1726882643.68989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882643.69028: stderr chunk (state=3): >>><<< 30529 1726882643.69031: stdout chunk (state=3): >>><<< 30529 1726882643.69067: done transferring module to remote 30529 1726882643.69076: _low_level_execute_command(): starting 30529 1726882643.69080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/ /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py && sleep 0' 30529 1726882643.69519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.69522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882643.69525: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.69529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882643.69531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882643.69533: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.69577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.69580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.69627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.71321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882643.71343: stderr chunk (state=3): >>><<< 30529 1726882643.71346: stdout chunk (state=3): >>><<< 30529 1726882643.71360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882643.71363: _low_level_execute_command(): starting 30529 1726882643.71373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/AnsiballZ_network_connections.py && sleep 0' 30529 1726882643.71776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.71779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.71781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882643.71783: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.71785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.71836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.71839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.71891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882643.96360: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882643.98247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882643.98251: stdout chunk (state=3): >>><<< 30529 1726882643.98253: stderr chunk (state=3): >>><<< 30529 1726882643.98257: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882643.98260: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882643.98262: _low_level_execute_command(): starting 30529 1726882643.98265: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882643.633208-33311-234596860203377/ > /dev/null 2>&1 && sleep 0' 30529 1726882643.98999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882643.99033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882643.99062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882643.99165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882643.99186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882643.99215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882643.99232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882643.99335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.01155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.01162: stdout chunk (state=3): >>><<< 30529 1726882644.01169: stderr chunk (state=3): >>><<< 30529 1726882644.01181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882644.01191: handler run complete 30529 1726882644.01222: attempt loop complete, returning result 30529 1726882644.01225: _execute() done 30529 1726882644.01228: dumping result to json 30529 1726882644.01230: done dumping result, returning 30529 1726882644.01239: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-00000000128a] 30529 1726882644.01241: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128a 30529 1726882644.01337: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128a 30529 1726882644.01340: WORKER PROCESS EXITING ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active 30529 1726882644.01446: no more pending results, returning what we have 30529 1726882644.01449: results queue empty 30529 1726882644.01450: checking for any_errors_fatal 30529 1726882644.01456: done checking for any_errors_fatal 30529 1726882644.01456: checking for max_fail_percentage 30529 1726882644.01458: done checking for max_fail_percentage 30529 1726882644.01459: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.01460: done checking to see if all hosts have failed 30529 1726882644.01460: getting the remaining hosts 
for this loop 30529 1726882644.01462: done getting the remaining hosts for this loop 30529 1726882644.01465: getting the next task for host managed_node1 30529 1726882644.01473: done getting next task for host managed_node1 30529 1726882644.01476: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882644.01480: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.01491: getting variables 30529 1726882644.01670: in VariableManager get_vars() 30529 1726882644.01750: Calling all_inventory to load vars for managed_node1 30529 1726882644.01753: Calling groups_inventory to load vars for managed_node1 30529 1726882644.01756: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.01766: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.01769: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.01772: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.02898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.03872: done with get_vars() 30529 1726882644.03887: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:24 -0400 (0:00:00.511) 0:00:58.065 ****** 30529 1726882644.03952: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882644.04183: worker is 1 (out of 1 available) 30529 1726882644.04201: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882644.04214: done queuing things up, now waiting for results queue to drain 30529 1726882644.04216: waiting for pending results... 
30529 1726882644.04410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882644.04628: in run() - task 12673a56-9f93-b0f1-edc0-00000000128b 30529 1726882644.04632: variable 'ansible_search_path' from source: unknown 30529 1726882644.04636: variable 'ansible_search_path' from source: unknown 30529 1726882644.04675: calling self._execute() 30529 1726882644.04782: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.04800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.04817: variable 'omit' from source: magic vars 30529 1726882644.05242: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.05259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.05404: variable 'network_state' from source: role '' defaults 30529 1726882644.05419: Evaluated conditional (network_state != {}): False 30529 1726882644.05426: when evaluation is False, skipping this task 30529 1726882644.05433: _execute() done 30529 1726882644.05440: dumping result to json 30529 1726882644.05599: done dumping result, returning 30529 1726882644.05603: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-00000000128b] 30529 1726882644.05606: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128b 30529 1726882644.05678: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128b 30529 1726882644.05681: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882644.05744: no more pending results, returning what we have 30529 1726882644.05748: results queue empty 30529 1726882644.05749: checking for any_errors_fatal 30529 1726882644.05761: done checking for any_errors_fatal 
30529 1726882644.05762: checking for max_fail_percentage 30529 1726882644.05763: done checking for max_fail_percentage 30529 1726882644.05765: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.05766: done checking to see if all hosts have failed 30529 1726882644.05766: getting the remaining hosts for this loop 30529 1726882644.05768: done getting the remaining hosts for this loop 30529 1726882644.05772: getting the next task for host managed_node1 30529 1726882644.05781: done getting next task for host managed_node1 30529 1726882644.05785: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882644.05801: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.05829: getting variables 30529 1726882644.05831: in VariableManager get_vars() 30529 1726882644.05871: Calling all_inventory to load vars for managed_node1 30529 1726882644.05873: Calling groups_inventory to load vars for managed_node1 30529 1726882644.05876: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.05888: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.06020: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.06025: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.07478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.09203: done with get_vars() 30529 1726882644.09231: done getting variables 30529 1726882644.09288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:24 -0400 (0:00:00.053) 0:00:58.119 ****** 30529 1726882644.09334: entering _queue_task() for managed_node1/debug 30529 1726882644.09907: worker is 1 (out of 1 available) 30529 1726882644.09916: exiting _queue_task() for managed_node1/debug 30529 1726882644.09925: done queuing things up, now waiting for results queue to drain 30529 1726882644.09927: waiting for pending results... 
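The debug task queued above prints `__network_connections_result.stderr_lines`. By convention, Ansible module results carry both the raw `stderr` string and a derived `stderr_lines` list; outside of Ansible, that derivation is just Python's standard `splitlines()`, as this sketch shows:

```python
# The raw stderr captured from the network_connections module run
# (copied from the result shown later in this log):
stderr = ("[002] #0, state:up persistent_state:present, 'statebr': "
          "up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 "
          "skipped because already active\n")

# stderr_lines is the same text split on newlines; the trailing
# newline does not produce an empty final element.
stderr_lines = stderr.splitlines()
print(len(stderr_lines))  # 1, matching the single-entry list in the log
```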
30529 1726882644.10110: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882644.10168: in run() - task 12673a56-9f93-b0f1-edc0-00000000128c 30529 1726882644.10194: variable 'ansible_search_path' from source: unknown 30529 1726882644.10204: variable 'ansible_search_path' from source: unknown 30529 1726882644.10267: calling self._execute() 30529 1726882644.10350: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.10362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.10484: variable 'omit' from source: magic vars 30529 1726882644.10784: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.10817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.10924: variable 'omit' from source: magic vars 30529 1726882644.10927: variable 'omit' from source: magic vars 30529 1726882644.10944: variable 'omit' from source: magic vars 30529 1726882644.10988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882644.11044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882644.11069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882644.11097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.11118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.11161: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882644.11170: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.11176: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882644.11297: Set connection var ansible_shell_executable to /bin/sh 30529 1726882644.11309: Set connection var ansible_pipelining to False 30529 1726882644.11317: Set connection var ansible_shell_type to sh 30529 1726882644.11332: Set connection var ansible_timeout to 10 30529 1726882644.11340: Set connection var ansible_connection to ssh 30529 1726882644.11360: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882644.11466: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.11470: variable 'ansible_connection' from source: unknown 30529 1726882644.11473: variable 'ansible_module_compression' from source: unknown 30529 1726882644.11475: variable 'ansible_shell_type' from source: unknown 30529 1726882644.11477: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.11479: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.11481: variable 'ansible_pipelining' from source: unknown 30529 1726882644.11483: variable 'ansible_timeout' from source: unknown 30529 1726882644.11485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.11599: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882644.11618: variable 'omit' from source: magic vars 30529 1726882644.11630: starting attempt loop 30529 1726882644.11637: running the handler 30529 1726882644.11779: variable '__network_connections_result' from source: set_fact 30529 1726882644.11847: handler run complete 30529 1726882644.11868: attempt loop complete, returning result 30529 1726882644.11875: _execute() done 30529 1726882644.11882: dumping result to json 30529 1726882644.12008: 
done dumping result, returning 30529 1726882644.12014: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000128c] 30529 1726882644.12017: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128c 30529 1726882644.12082: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128c 30529 1726882644.12085: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active" ] } 30529 1726882644.12159: no more pending results, returning what we have 30529 1726882644.12163: results queue empty 30529 1726882644.12164: checking for any_errors_fatal 30529 1726882644.12169: done checking for any_errors_fatal 30529 1726882644.12170: checking for max_fail_percentage 30529 1726882644.12172: done checking for max_fail_percentage 30529 1726882644.12173: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.12174: done checking to see if all hosts have failed 30529 1726882644.12175: getting the remaining hosts for this loop 30529 1726882644.12176: done getting the remaining hosts for this loop 30529 1726882644.12180: getting the next task for host managed_node1 30529 1726882644.12200: done getting next task for host managed_node1 30529 1726882644.12204: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882644.12209: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882644.12223: getting variables 30529 1726882644.12225: in VariableManager get_vars() 30529 1726882644.12262: Calling all_inventory to load vars for managed_node1 30529 1726882644.12265: Calling groups_inventory to load vars for managed_node1 30529 1726882644.12268: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.12279: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.12284: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.12288: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.14119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.15816: done with get_vars() 30529 1726882644.15838: done getting variables 30529 1726882644.15900: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:24 -0400 (0:00:00.066) 0:00:58.185 ****** 30529 1726882644.15941: entering _queue_task() for managed_node1/debug 30529 1726882644.16248: worker is 1 (out of 1 available) 30529 1726882644.16262: exiting _queue_task() for managed_node1/debug 30529 1726882644.16277: done queuing things up, now waiting for results queue to drain 30529 1726882644.16279: waiting for pending results... 30529 1726882644.16584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882644.16682: in run() - task 12673a56-9f93-b0f1-edc0-00000000128d 30529 1726882644.16697: variable 'ansible_search_path' from source: unknown 30529 1726882644.16701: variable 'ansible_search_path' from source: unknown 30529 1726882644.16740: calling self._execute() 30529 1726882644.16814: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.16820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.16828: variable 'omit' from source: magic vars 30529 1726882644.17094: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.17103: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.17111: variable 'omit' from source: magic vars 30529 1726882644.17150: variable 'omit' from source: magic vars 30529 1726882644.17174: variable 'omit' from source: magic vars 30529 1726882644.17211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882644.17242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882644.17259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882644.17272: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.17284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.17310: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882644.17313: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.17315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.17386: Set connection var ansible_shell_executable to /bin/sh 30529 1726882644.17392: Set connection var ansible_pipelining to False 30529 1726882644.17396: Set connection var ansible_shell_type to sh 30529 1726882644.17403: Set connection var ansible_timeout to 10 30529 1726882644.17405: Set connection var ansible_connection to ssh 30529 1726882644.17410: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882644.17427: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.17431: variable 'ansible_connection' from source: unknown 30529 1726882644.17434: variable 'ansible_module_compression' from source: unknown 30529 1726882644.17437: variable 'ansible_shell_type' from source: unknown 30529 1726882644.17440: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.17442: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.17444: variable 'ansible_pipelining' from source: unknown 30529 1726882644.17446: variable 'ansible_timeout' from source: unknown 30529 1726882644.17448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.17547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882644.17555: variable 'omit' from source: magic vars 30529 1726882644.17565: starting attempt loop 30529 1726882644.17568: running the handler 30529 1726882644.17603: variable '__network_connections_result' from source: set_fact 30529 1726882644.17658: variable '__network_connections_result' from source: set_fact 30529 1726882644.17743: handler run complete 30529 1726882644.17759: attempt loop complete, returning result 30529 1726882644.17762: _execute() done 30529 1726882644.17765: dumping result to json 30529 1726882644.17767: done dumping result, returning 30529 1726882644.17777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000128d] 30529 1726882644.17780: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128d 30529 1726882644.17867: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128d 30529 1726882644.17870: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 925d78f3-a59a-474c-aff9-927d62a7a239 skipped because already active" ] } } 30529 1726882644.17954: no more pending results, returning what we have 30529 1726882644.17958: results queue empty 30529 
1726882644.17959: checking for any_errors_fatal 30529 1726882644.17964: done checking for any_errors_fatal 30529 1726882644.17965: checking for max_fail_percentage 30529 1726882644.17966: done checking for max_fail_percentage 30529 1726882644.17967: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.17968: done checking to see if all hosts have failed 30529 1726882644.17969: getting the remaining hosts for this loop 30529 1726882644.17970: done getting the remaining hosts for this loop 30529 1726882644.17973: getting the next task for host managed_node1 30529 1726882644.17980: done getting next task for host managed_node1 30529 1726882644.17983: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882644.17987: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.18003: getting variables 30529 1726882644.18004: in VariableManager get_vars() 30529 1726882644.18036: Calling all_inventory to load vars for managed_node1 30529 1726882644.18039: Calling groups_inventory to load vars for managed_node1 30529 1726882644.18045: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.18053: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.18056: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.18058: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.19189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.20462: done with get_vars() 30529 1726882644.20477: done getting variables 30529 1726882644.20520: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:24 -0400 (0:00:00.046) 0:00:58.231 ****** 30529 1726882644.20543: entering _queue_task() for managed_node1/debug 30529 1726882644.20749: worker is 1 (out of 1 available) 30529 1726882644.20762: exiting _queue_task() for managed_node1/debug 30529 1726882644.20775: done queuing things up, now waiting for results queue to drain 30529 1726882644.20777: waiting for pending results... 
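Each `Calling … to load vars` line above is one layer of the merge `VariableManager.get_vars()` performs for the host, with later sources overriding earlier ones. A toy sketch of that layered merge; the source names come from the log, but the merge logic here is illustrative, not ansible-core's implementation (real precedence has many more layers):

```python
# Ordered lowest to highest, in the sequence the log shows them called:
SOURCES = [
    ("all_inventory",            {"x": 1, "y": 1}),
    ("groups_inventory",         {"y": 2}),
    ("all_plugins_inventory",    {}),
    ("all_plugins_play",         {"z": 3}),
    ("groups_plugins_inventory", {}),
    ("groups_plugins_play",      {"y": 4}),
]

def get_vars(sources):
    merged = {}
    for _name, data in sources:  # later calls overwrite earlier ones
        merged.update(data)
    return merged

print(get_vars(SOURCES))  # {'x': 1, 'y': 4, 'z': 3}
```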
30529 1726882644.20955: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882644.21049: in run() - task 12673a56-9f93-b0f1-edc0-00000000128e 30529 1726882644.21061: variable 'ansible_search_path' from source: unknown 30529 1726882644.21065: variable 'ansible_search_path' from source: unknown 30529 1726882644.21090: calling self._execute() 30529 1726882644.21170: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.21174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.21182: variable 'omit' from source: magic vars 30529 1726882644.21453: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.21462: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.21543: variable 'network_state' from source: role '' defaults 30529 1726882644.21555: Evaluated conditional (network_state != {}): False 30529 1726882644.21558: when evaluation is False, skipping this task 30529 1726882644.21561: _execute() done 30529 1726882644.21563: dumping result to json 30529 1726882644.21566: done dumping result, returning 30529 1726882644.21573: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-00000000128e] 30529 1726882644.21578: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128e 30529 1726882644.21666: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128e 30529 1726882644.21670: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882644.21714: no more pending results, returning what we have 30529 1726882644.21718: results queue empty 30529 1726882644.21719: checking for any_errors_fatal 30529 1726882644.21726: done checking for any_errors_fatal 30529 1726882644.21727: checking for 
max_fail_percentage 30529 1726882644.21728: done checking for max_fail_percentage 30529 1726882644.21729: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.21730: done checking to see if all hosts have failed 30529 1726882644.21731: getting the remaining hosts for this loop 30529 1726882644.21732: done getting the remaining hosts for this loop 30529 1726882644.21735: getting the next task for host managed_node1 30529 1726882644.21742: done getting next task for host managed_node1 30529 1726882644.21745: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882644.21749: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.21767: getting variables 30529 1726882644.21769: in VariableManager get_vars() 30529 1726882644.21802: Calling all_inventory to load vars for managed_node1 30529 1726882644.21804: Calling groups_inventory to load vars for managed_node1 30529 1726882644.21807: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.21815: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.21818: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.21820: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.23078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.23931: done with get_vars() 30529 1726882644.23944: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:24 -0400 (0:00:00.034) 0:00:58.266 ****** 30529 1726882644.24011: entering _queue_task() for managed_node1/ping 30529 1726882644.24201: worker is 1 (out of 1 available) 30529 1726882644.24214: exiting _queue_task() for managed_node1/ping 30529 1726882644.24226: done queuing things up, now waiting for results queue to drain 30529 1726882644.24228: waiting for pending results... 
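The `Re-test connectivity` task dispatches `ansible.builtin.ping`, which is not an ICMP ping: it performs a full SSH round-trip and returns `pong` only if a usable Python exists on the managed host, which is why the following log lines show a complete connection and temp-dir setup. A simplified local sketch of the module's contract:

```python
def ping_module(data: str = "pong") -> dict:
    """Simplified contract of ansible.builtin.ping: echo back `data`.
    The real module also raises when data == "crash", a hook used to
    exercise error-handling paths in tests."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}

print(ping_module())  # {'ping': 'pong', 'changed': False}
```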
30529 1726882644.24398: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882644.24491: in run() - task 12673a56-9f93-b0f1-edc0-00000000128f 30529 1726882644.24506: variable 'ansible_search_path' from source: unknown 30529 1726882644.24510: variable 'ansible_search_path' from source: unknown 30529 1726882644.24534: calling self._execute() 30529 1726882644.24605: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.24608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.24616: variable 'omit' from source: magic vars 30529 1726882644.24869: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.24878: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.24885: variable 'omit' from source: magic vars 30529 1726882644.24930: variable 'omit' from source: magic vars 30529 1726882644.24951: variable 'omit' from source: magic vars 30529 1726882644.24980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882644.25012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882644.25028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882644.25041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.25051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882644.25073: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882644.25076: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.25079: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882644.25153: Set connection var ansible_shell_executable to /bin/sh 30529 1726882644.25156: Set connection var ansible_pipelining to False 30529 1726882644.25159: Set connection var ansible_shell_type to sh 30529 1726882644.25167: Set connection var ansible_timeout to 10 30529 1726882644.25169: Set connection var ansible_connection to ssh 30529 1726882644.25174: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882644.25190: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.25196: variable 'ansible_connection' from source: unknown 30529 1726882644.25199: variable 'ansible_module_compression' from source: unknown 30529 1726882644.25202: variable 'ansible_shell_type' from source: unknown 30529 1726882644.25204: variable 'ansible_shell_executable' from source: unknown 30529 1726882644.25207: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.25213: variable 'ansible_pipelining' from source: unknown 30529 1726882644.25215: variable 'ansible_timeout' from source: unknown 30529 1726882644.25218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.25358: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882644.25367: variable 'omit' from source: magic vars 30529 1726882644.25372: starting attempt loop 30529 1726882644.25375: running the handler 30529 1726882644.25386: _low_level_execute_command(): starting 30529 1726882644.25396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882644.25856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882644.25866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.25900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.25903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.25947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882644.25950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882644.25956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.26007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.27642: stdout chunk (state=3): >>>/root <<< 30529 1726882644.27744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.27767: stderr chunk (state=3): >>><<< 30529 1726882644.27770: stdout chunk (state=3): >>><<< 30529 1726882644.27791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882644.27801: _low_level_execute_command(): starting 30529 1726882644.27806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191 `" && echo ansible-tmp-1726882644.277869-33338-67716146152191="` echo /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191 `" ) && sleep 0' 30529 1726882644.28218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.28221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882644.28224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.28232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.28234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.28279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882644.28282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.28330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.30188: stdout chunk (state=3): >>>ansible-tmp-1726882644.277869-33338-67716146152191=/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191 <<< 30529 1726882644.30306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.30323: stderr chunk (state=3): >>><<< 30529 1726882644.30326: stdout chunk (state=3): >>><<< 30529 1726882644.30340: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882644.277869-33338-67716146152191=/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882644.30370: variable 'ansible_module_compression' from source: unknown 30529 1726882644.30405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882644.30435: variable 'ansible_facts' from source: unknown 30529 1726882644.30480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py 30529 1726882644.30570: Sending initial data 30529 1726882644.30574: Sent initial data (151 bytes) 30529 1726882644.30967: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882644.30976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882644.30998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.31002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882644.31010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.31062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882644.31067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.31110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.32616: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882644.32622: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882644.32654: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882644.32702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpt6p7p8_0 /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py <<< 30529 1726882644.32708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py" <<< 30529 1726882644.32739: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpt6p7p8_0" to remote "/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py" <<< 30529 1726882644.33235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.33268: stderr chunk (state=3): >>><<< 30529 1726882644.33271: stdout chunk (state=3): >>><<< 30529 1726882644.33309: done transferring module to remote 30529 1726882644.33320: _low_level_execute_command(): starting 30529 1726882644.33324: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/ /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py && sleep 0' 30529 1726882644.33730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882644.33734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882644.33736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.33738: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.33743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.33788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882644.33791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.33842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.35532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.35552: stderr chunk (state=3): >>><<< 30529 1726882644.35555: stdout chunk (state=3): >>><<< 30529 1726882644.35567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882644.35571: _low_level_execute_command(): starting 30529 1726882644.35575: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/AnsiballZ_ping.py && sleep 0' 30529 1726882644.35975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.35978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.35981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882644.35983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882644.35984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.36037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: 
fd 3 setting O_NONBLOCK <<< 30529 1726882644.36044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.36086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.50879: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882644.52079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882644.52083: stdout chunk (state=3): >>><<< 30529 1726882644.52086: stderr chunk (state=3): >>><<< 30529 1726882644.52109: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
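The `{"ping": "pong", "invocation": {...}}` payload above is the entire contract of the module that `AnsiballZ_ping.py` wraps: echo the `data` argument back (default `"pong"`) and report the invocation. A minimal sketch of that contract, with illustrative names rather than Ansible's internal API:

```python
import json

def run_ping_module(module_args: dict) -> dict:
    """Sketch of ansible.builtin.ping's behaviour as seen in the log above.

    Illustrative only: the real module runs remotely inside the AnsiballZ
    wrapper and emits its result as JSON on stdout, which the controller
    parses out of _low_level_execute_command()'s captured stdout.
    """
    data = module_args.get("data", "pong")
    if data == "crash":
        # the real module deliberately raises on this sentinel value
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

result = run_ping_module({"data": "pong"})
print(json.dumps(result))
```

The controller never inspects the remote process beyond `rc`, `stdout`, and `stderr`; everything structured (here, `"ping": "pong"`) travels as JSON on stdout, which is why the log shows the raw JSON as a stdout chunk before the parsed `ok:` result.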
30529 1726882644.52229: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882644.52233: _low_level_execute_command(): starting 30529 1726882644.52235: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882644.277869-33338-67716146152191/ > /dev/null 2>&1 && sleep 0' 30529 1726882644.52974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882644.52988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882644.53007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882644.53024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882644.53039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882644.53083: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882644.53150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882644.53168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882644.53206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882644.53272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882644.55502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882644.55505: stdout chunk (state=3): >>><<< 30529 1726882644.55507: stderr chunk (state=3): >>><<< 30529 1726882644.55510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882644.55512: handler run complete 30529 1726882644.55514: attempt loop complete, returning result 30529 1726882644.55516: _execute() done 30529 1726882644.55518: dumping result to json 30529 1726882644.55520: done dumping result, returning 30529 1726882644.55522: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-00000000128f] 30529 1726882644.55524: sending task result for task 12673a56-9f93-b0f1-edc0-00000000128f 30529 1726882644.55903: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000128f 30529 1726882644.55906: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882644.55982: no more pending results, returning what we have 30529 1726882644.55986: results queue empty 30529 1726882644.55987: checking for any_errors_fatal 30529 1726882644.55999: done checking for any_errors_fatal 30529 1726882644.56001: checking for max_fail_percentage 30529 1726882644.56003: done checking for max_fail_percentage 30529 1726882644.56004: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.56005: done checking to see if all hosts have failed 30529 1726882644.56006: getting the remaining hosts for this loop 30529 1726882644.56008: done getting the remaining hosts for this loop 30529 1726882644.56012: getting the next task for host managed_node1 30529 1726882644.56031: done getting next task for host managed_node1 30529 1726882644.56034: ^ task is: TASK: meta (role_complete) 30529 1726882644.56040: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882644.56054: getting variables 30529 1726882644.56057: in VariableManager get_vars() 30529 1726882644.56408: Calling all_inventory to load vars for managed_node1 30529 1726882644.56412: Calling groups_inventory to load vars for managed_node1 30529 1726882644.56414: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.56426: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.56429: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.56433: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.59449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.62720: done with get_vars() 30529 1726882644.62740: done getting variables 30529 1726882644.62821: done queuing things up, now waiting for results queue to drain 30529 1726882644.62823: results queue empty 30529 1726882644.62823: checking for any_errors_fatal 30529 1726882644.62826: done checking for 
any_errors_fatal 30529 1726882644.62826: checking for max_fail_percentage 30529 1726882644.62827: done checking for max_fail_percentage 30529 1726882644.62828: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.62829: done checking to see if all hosts have failed 30529 1726882644.62829: getting the remaining hosts for this loop 30529 1726882644.62830: done getting the remaining hosts for this loop 30529 1726882644.62832: getting the next task for host managed_node1 30529 1726882644.62837: done getting next task for host managed_node1 30529 1726882644.62840: ^ task is: TASK: Test 30529 1726882644.62842: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.62844: getting variables 30529 1726882644.62845: in VariableManager get_vars() 30529 1726882644.62857: Calling all_inventory to load vars for managed_node1 30529 1726882644.62860: Calling groups_inventory to load vars for managed_node1 30529 1726882644.62862: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.62867: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.62869: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.62872: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.64092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.65663: done with get_vars() 30529 1726882644.65683: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:37:24 -0400 (0:00:00.417) 0:00:58.683 ****** 30529 1726882644.65770: entering _queue_task() for managed_node1/include_tasks 30529 1726882644.66050: worker is 1 (out of 1 available) 30529 1726882644.66063: exiting _queue_task() for managed_node1/include_tasks 30529 1726882644.66077: done queuing things up, now waiting for results queue to drain 30529 1726882644.66079: waiting for pending results... 
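The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` sequence above follows a simple producer-consumer shape: the strategy hands the task to a free worker process, then blocks draining a results queue until the worker sends its result back. A hedged sketch of that dispatch pattern (threads standing in for Ansible's actual `WorkerProcess`):

```python
from queue import Queue
from threading import Thread

def worker(task_q: Queue, result_q: Queue) -> None:
    """Stand-in for a strategy worker: take one task, report one result."""
    task = task_q.get()
    # pretend the task ran successfully on the target host
    result_q.put({"task": task, "rc": 0})

task_q: Queue = Queue()
result_q: Queue = Queue()
Thread(target=worker, args=(task_q, result_q), daemon=True).start()

task_q.put("TASK: Test")           # entering _queue_task()
result = result_q.get(timeout=5)   # waiting for pending results...
print(result)
```

With `forks=1` (as in this run, "1 out of 1 available") the pattern degenerates to one task in flight at a time, which is why each `exiting _queue_task()` is immediately followed by draining the results queue.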
30529 1726882644.66267: running TaskExecutor() for managed_node1/TASK: Test 30529 1726882644.66344: in run() - task 12673a56-9f93-b0f1-edc0-000000001009 30529 1726882644.66356: variable 'ansible_search_path' from source: unknown 30529 1726882644.66360: variable 'ansible_search_path' from source: unknown 30529 1726882644.66398: variable 'lsr_test' from source: include params 30529 1726882644.66559: variable 'lsr_test' from source: include params 30529 1726882644.66615: variable 'omit' from source: magic vars 30529 1726882644.66713: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.66721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.66732: variable 'omit' from source: magic vars 30529 1726882644.66895: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.66905: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.66911: variable 'item' from source: unknown 30529 1726882644.66961: variable 'item' from source: unknown 30529 1726882644.66979: variable 'item' from source: unknown 30529 1726882644.67025: variable 'item' from source: unknown 30529 1726882644.67146: dumping result to json 30529 1726882644.67149: done dumping result, returning 30529 1726882644.67152: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-000000001009] 30529 1726882644.67153: sending task result for task 12673a56-9f93-b0f1-edc0-000000001009 30529 1726882644.67187: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001009 30529 1726882644.67189: WORKER PROCESS EXITING 30529 1726882644.67211: no more pending results, returning what we have 30529 1726882644.67216: in VariableManager get_vars() 30529 1726882644.67254: Calling all_inventory to load vars for managed_node1 30529 1726882644.67256: Calling groups_inventory to load vars for managed_node1 30529 1726882644.67260: Calling all_plugins_inventory to load 
vars for managed_node1 30529 1726882644.67271: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.67274: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.67277: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.68623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.69463: done with get_vars() 30529 1726882644.69476: variable 'ansible_search_path' from source: unknown 30529 1726882644.69477: variable 'ansible_search_path' from source: unknown 30529 1726882644.69505: we have included files to process 30529 1726882644.69506: generating all_blocks data 30529 1726882644.69508: done generating all_blocks data 30529 1726882644.69512: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882644.69512: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882644.69514: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882644.69631: done processing included file 30529 1726882644.69632: iterating over new_blocks loaded from include file 30529 1726882644.69633: in VariableManager get_vars() 30529 1726882644.69645: done with get_vars() 30529 1726882644.69647: filtering new block on tags 30529 1726882644.69663: done filtering new block on tags 30529 1726882644.69664: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node1 => (item=tasks/remove_profile.yml) 30529 1726882644.69668: extending task lists for all hosts with included blocks 30529 1726882644.70138: done extending task lists 30529 
1726882644.70139: done processing included files 30529 1726882644.70140: results queue empty 30529 1726882644.70140: checking for any_errors_fatal 30529 1726882644.70141: done checking for any_errors_fatal 30529 1726882644.70142: checking for max_fail_percentage 30529 1726882644.70142: done checking for max_fail_percentage 30529 1726882644.70143: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.70143: done checking to see if all hosts have failed 30529 1726882644.70144: getting the remaining hosts for this loop 30529 1726882644.70145: done getting the remaining hosts for this loop 30529 1726882644.70146: getting the next task for host managed_node1 30529 1726882644.70150: done getting next task for host managed_node1 30529 1726882644.70151: ^ task is: TASK: Include network role 30529 1726882644.70153: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.70154: getting variables 30529 1726882644.70155: in VariableManager get_vars() 30529 1726882644.70161: Calling all_inventory to load vars for managed_node1 30529 1726882644.70163: Calling groups_inventory to load vars for managed_node1 30529 1726882644.70164: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.70168: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.70169: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.70171: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.71020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.72279: done with get_vars() 30529 1726882644.72298: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:37:24 -0400 (0:00:00.065) 0:00:58.749 ****** 30529 1726882644.72361: entering _queue_task() for managed_node1/include_role 30529 1726882644.72611: worker is 1 (out of 1 available) 30529 1726882644.72624: exiting _queue_task() for managed_node1/include_role 30529 1726882644.72636: done queuing things up, now waiting for results queue to drain 30529 1726882644.72638: waiting for pending results... 
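The TASK [Include network role] record above (task path `remove_profile.yml:3`) is followed in the log by "processing included file: fedora.linux_system_roles.network", so the task is an `include_role` by fully qualified collection name. A minimal hedged sketch of what that task likely looks like — the exact YAML body is not shown in the log, and any role variables passed alongside it are omitted here:

```yaml
# Hedged sketch of tasks/remove_profile.yml:3.
# The role name is confirmed by the log's "processing included file:
# fedora.linux_system_roles.network"; everything else is assumed minimal.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
```

The `Evaluated conditional (ansible_distribution_major_version != '6'): True` record just before it indicates the task also carries a `when:` guard excluding EL6 hosts.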
30529 1726882644.72814: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882644.72901: in run() - task 12673a56-9f93-b0f1-edc0-0000000013e8 30529 1726882644.72913: variable 'ansible_search_path' from source: unknown 30529 1726882644.72916: variable 'ansible_search_path' from source: unknown 30529 1726882644.72942: calling self._execute() 30529 1726882644.73019: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.73023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.73031: variable 'omit' from source: magic vars 30529 1726882644.73310: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.73319: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.73325: _execute() done 30529 1726882644.73328: dumping result to json 30529 1726882644.73330: done dumping result, returning 30529 1726882644.73336: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-0000000013e8] 30529 1726882644.73341: sending task result for task 12673a56-9f93-b0f1-edc0-0000000013e8 30529 1726882644.73445: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000013e8 30529 1726882644.73449: WORKER PROCESS EXITING 30529 1726882644.73474: no more pending results, returning what we have 30529 1726882644.73479: in VariableManager get_vars() 30529 1726882644.73523: Calling all_inventory to load vars for managed_node1 30529 1726882644.73526: Calling groups_inventory to load vars for managed_node1 30529 1726882644.73529: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.73541: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.73544: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.73547: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.78728: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.79561: done with get_vars() 30529 1726882644.79574: variable 'ansible_search_path' from source: unknown 30529 1726882644.79575: variable 'ansible_search_path' from source: unknown 30529 1726882644.79654: variable 'omit' from source: magic vars 30529 1726882644.79679: variable 'omit' from source: magic vars 30529 1726882644.79687: variable 'omit' from source: magic vars 30529 1726882644.79690: we have included files to process 30529 1726882644.79691: generating all_blocks data 30529 1726882644.79692: done generating all_blocks data 30529 1726882644.79694: processing included file: fedora.linux_system_roles.network 30529 1726882644.79707: in VariableManager get_vars() 30529 1726882644.79716: done with get_vars() 30529 1726882644.79731: in VariableManager get_vars() 30529 1726882644.79741: done with get_vars() 30529 1726882644.79762: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882644.79829: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882644.79872: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882644.80125: in VariableManager get_vars() 30529 1726882644.80138: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882644.81328: iterating over new_blocks loaded from include file 30529 1726882644.81330: in VariableManager get_vars() 30529 1726882644.81341: done with get_vars() 30529 1726882644.81342: filtering new block on tags 30529 1726882644.81499: done filtering new block on tags 30529 1726882644.81502: in VariableManager get_vars() 30529 1726882644.81512: done with get_vars() 30529 1726882644.81513: filtering new block on tags 30529 1726882644.81525: done 
filtering new block on tags 30529 1726882644.81526: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882644.81529: extending task lists for all hosts with included blocks 30529 1726882644.81588: done extending task lists 30529 1726882644.81588: done processing included files 30529 1726882644.81591: results queue empty 30529 1726882644.81591: checking for any_errors_fatal 30529 1726882644.81595: done checking for any_errors_fatal 30529 1726882644.81595: checking for max_fail_percentage 30529 1726882644.81596: done checking for max_fail_percentage 30529 1726882644.81596: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.81597: done checking to see if all hosts have failed 30529 1726882644.81597: getting the remaining hosts for this loop 30529 1726882644.81598: done getting the remaining hosts for this loop 30529 1726882644.81600: getting the next task for host managed_node1 30529 1726882644.81603: done getting next task for host managed_node1 30529 1726882644.81604: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882644.81606: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882644.81613: getting variables 30529 1726882644.81613: in VariableManager get_vars() 30529 1726882644.81623: Calling all_inventory to load vars for managed_node1 30529 1726882644.81625: Calling groups_inventory to load vars for managed_node1 30529 1726882644.81627: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.81630: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.81632: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.81633: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.82312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.83158: done with get_vars() 30529 1726882644.83172: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:24 -0400 (0:00:00.108) 0:00:58.858 ****** 30529 1726882644.83224: entering _queue_task() for managed_node1/include_tasks 30529 1726882644.83512: worker is 1 (out of 1 available) 30529 1726882644.83524: exiting _queue_task() for managed_node1/include_tasks 30529 1726882644.83537: done queuing things up, now waiting for results queue to drain 30529 1726882644.83538: waiting for pending results... 
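The TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] record above sits at `roles/network/tasks/main.yml:4`, and the log confirms it is an `include_tasks` (`entering _queue_task() for managed_node1/include_tasks`) that goes on to load `set_facts.yml`. A hedged sketch of that task, assuming nothing beyond what the log shows:

```yaml
# Hedged sketch of roles/network/tasks/main.yml:4. The action type
# (include_tasks) and the included file (set_facts.yml) are both confirmed
# by the log records; the exact YAML layout is assumed.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```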
30529 1726882644.83721: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882644.83812: in run() - task 12673a56-9f93-b0f1-edc0-00000000145f 30529 1726882644.83826: variable 'ansible_search_path' from source: unknown 30529 1726882644.83830: variable 'ansible_search_path' from source: unknown 30529 1726882644.83859: calling self._execute() 30529 1726882644.83936: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.83940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.83951: variable 'omit' from source: magic vars 30529 1726882644.84235: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.84245: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.84251: _execute() done 30529 1726882644.84256: dumping result to json 30529 1726882644.84259: done dumping result, returning 30529 1726882644.84266: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-00000000145f] 30529 1726882644.84273: sending task result for task 12673a56-9f93-b0f1-edc0-00000000145f 30529 1726882644.84361: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000145f 30529 1726882644.84364: WORKER PROCESS EXITING 30529 1726882644.84417: no more pending results, returning what we have 30529 1726882644.84422: in VariableManager get_vars() 30529 1726882644.84463: Calling all_inventory to load vars for managed_node1 30529 1726882644.84466: Calling groups_inventory to load vars for managed_node1 30529 1726882644.84468: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.84487: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.84494: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.84497: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882644.85292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.86171: done with get_vars() 30529 1726882644.86184: variable 'ansible_search_path' from source: unknown 30529 1726882644.86185: variable 'ansible_search_path' from source: unknown 30529 1726882644.86216: we have included files to process 30529 1726882644.86217: generating all_blocks data 30529 1726882644.86219: done generating all_blocks data 30529 1726882644.86221: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882644.86222: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882644.86223: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882644.86602: done processing included file 30529 1726882644.86604: iterating over new_blocks loaded from include file 30529 1726882644.86605: in VariableManager get_vars() 30529 1726882644.86621: done with get_vars() 30529 1726882644.86622: filtering new block on tags 30529 1726882644.86641: done filtering new block on tags 30529 1726882644.86643: in VariableManager get_vars() 30529 1726882644.86658: done with get_vars() 30529 1726882644.86659: filtering new block on tags 30529 1726882644.86684: done filtering new block on tags 30529 1726882644.86686: in VariableManager get_vars() 30529 1726882644.86704: done with get_vars() 30529 1726882644.86705: filtering new block on tags 30529 1726882644.86728: done filtering new block on tags 30529 1726882644.86729: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882644.86733: extending task lists for 
all hosts with included blocks 30529 1726882644.87681: done extending task lists 30529 1726882644.87683: done processing included files 30529 1726882644.87683: results queue empty 30529 1726882644.87684: checking for any_errors_fatal 30529 1726882644.87686: done checking for any_errors_fatal 30529 1726882644.87687: checking for max_fail_percentage 30529 1726882644.87687: done checking for max_fail_percentage 30529 1726882644.87688: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.87688: done checking to see if all hosts have failed 30529 1726882644.87691: getting the remaining hosts for this loop 30529 1726882644.87692: done getting the remaining hosts for this loop 30529 1726882644.87695: getting the next task for host managed_node1 30529 1726882644.87699: done getting next task for host managed_node1 30529 1726882644.87700: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882644.87703: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882644.87710: getting variables 30529 1726882644.87711: in VariableManager get_vars() 30529 1726882644.87719: Calling all_inventory to load vars for managed_node1 30529 1726882644.87721: Calling groups_inventory to load vars for managed_node1 30529 1726882644.87722: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.87725: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.87727: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.87730: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.88388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.89234: done with get_vars() 30529 1726882644.89251: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:24 -0400 (0:00:00.060) 0:00:58.919 ****** 30529 1726882644.89307: entering _queue_task() for managed_node1/setup 30529 1726882644.89583: worker is 1 (out of 1 available) 30529 1726882644.89601: exiting _queue_task() for managed_node1/setup 30529 1726882644.89614: done queuing things up, now waiting for results queue to drain 30529 1726882644.89616: waiting for pending results... 
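The TASK [... : Ensure ansible_facts used by role are present] record above (`set_facts.yml:3`) is a `setup` task (the log enters `_queue_task() for managed_node1/setup`) guarded by the conditional the log prints verbatim. A hedged sketch — the `when:` expression is copied from the log's "Evaluated conditional" record, while the `gather_subset` value is an assumption:

```yaml
# Hedged sketch of roles/network/tasks/set_facts.yml:3. The `when` expression
# is taken verbatim from the log; gather_subset: min is an assumption.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

In the run above the `difference` filter yields an empty list (every required fact is already cached), so the conditional evaluates False and the task is skipped; the skip result is censored because the task apparently sets `no_log: true`, per the "output has been hidden" message.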
30529 1726882644.89801: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882644.89898: in run() - task 12673a56-9f93-b0f1-edc0-0000000014b6 30529 1726882644.89909: variable 'ansible_search_path' from source: unknown 30529 1726882644.89913: variable 'ansible_search_path' from source: unknown 30529 1726882644.89940: calling self._execute() 30529 1726882644.90017: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.90021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.90029: variable 'omit' from source: magic vars 30529 1726882644.90316: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.90326: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.90469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882644.91949: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882644.91995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882644.92026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882644.92050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882644.92070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882644.92131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882644.92155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882644.92174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882644.92202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882644.92213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882644.92253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882644.92270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882644.92287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882644.92314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882644.92324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882644.92433: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882644.92441: variable 'ansible_facts' from source: unknown 30529 1726882644.92879: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882644.92883: when evaluation is False, skipping this task 30529 1726882644.92886: _execute() done 30529 1726882644.92888: dumping result to json 30529 1726882644.92895: done dumping result, returning 30529 1726882644.92898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-0000000014b6] 30529 1726882644.92905: sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b6 30529 1726882644.92992: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b6 30529 1726882644.92998: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882644.93058: no more pending results, returning what we have 30529 1726882644.93061: results queue empty 30529 1726882644.93062: checking for any_errors_fatal 30529 1726882644.93063: done checking for any_errors_fatal 30529 1726882644.93064: checking for max_fail_percentage 30529 1726882644.93066: done checking for max_fail_percentage 30529 1726882644.93066: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.93067: done checking to see if all hosts have failed 30529 1726882644.93068: getting the remaining hosts for this loop 30529 1726882644.93070: done getting the remaining hosts for this loop 30529 1726882644.93073: getting the next task for host managed_node1 30529 1726882644.93086: done getting next task for host managed_node1 30529 1726882644.93092: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882644.93100: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.93123: getting variables 30529 1726882644.93125: in VariableManager get_vars() 30529 1726882644.93164: Calling all_inventory to load vars for managed_node1 30529 1726882644.93167: Calling groups_inventory to load vars for managed_node1 30529 1726882644.93169: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.93178: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.93181: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.93192: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.94040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882644.95236: done with get_vars() 30529 1726882644.95260: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:24 -0400 (0:00:00.060) 0:00:58.979 ****** 30529 1726882644.95363: entering _queue_task() for managed_node1/stat 30529 1726882644.95729: worker is 1 (out of 1 available) 30529 1726882644.95742: exiting _queue_task() for managed_node1/stat 30529 1726882644.95756: done queuing things up, now waiting for results queue to drain 30529 1726882644.95757: waiting for pending results... 
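The TASK [... : Check if system is ostree] record above (`set_facts.yml:12`) is a `stat` task (the log enters `_queue_task() for managed_node1/stat`) skipped because its `false_condition` was `not __network_is_ostree is defined`. A hedged sketch — the `when:` clause matches the log, while the stat path and register name are assumptions:

```yaml
# Hedged sketch of roles/network/tasks/set_facts.yml:12. The `when` clause is
# taken from the log's false_condition; the path (/run/ostree-booted) and the
# register name (__ostree_booted_stat) are illustrative assumptions.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined
```

Here the log reports `variable '__network_is_ostree' from source: set_fact`, i.e. an earlier run of the role already set the flag, so the guard is False and the stat never executes.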
30529 1726882644.96045: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882644.96146: in run() - task 12673a56-9f93-b0f1-edc0-0000000014b8 30529 1726882644.96159: variable 'ansible_search_path' from source: unknown 30529 1726882644.96162: variable 'ansible_search_path' from source: unknown 30529 1726882644.96190: calling self._execute() 30529 1726882644.96270: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882644.96273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882644.96284: variable 'omit' from source: magic vars 30529 1726882644.96571: variable 'ansible_distribution_major_version' from source: facts 30529 1726882644.96583: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882644.96703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882644.96904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882644.96936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882644.96962: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882644.96990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882644.97055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882644.97073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882644.97092: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882644.97115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882644.97180: variable '__network_is_ostree' from source: set_fact 30529 1726882644.97185: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882644.97188: when evaluation is False, skipping this task 30529 1726882644.97191: _execute() done 30529 1726882644.97198: dumping result to json 30529 1726882644.97202: done dumping result, returning 30529 1726882644.97212: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-0000000014b8] 30529 1726882644.97215: sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b8 30529 1726882644.97296: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b8 30529 1726882644.97299: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882644.97347: no more pending results, returning what we have 30529 1726882644.97350: results queue empty 30529 1726882644.97351: checking for any_errors_fatal 30529 1726882644.97358: done checking for any_errors_fatal 30529 1726882644.97359: checking for max_fail_percentage 30529 1726882644.97360: done checking for max_fail_percentage 30529 1726882644.97361: checking to see if all hosts have failed and the running result is not ok 30529 1726882644.97362: done checking to see if all hosts have failed 30529 1726882644.97363: getting the remaining hosts for this loop 30529 1726882644.97364: done getting the remaining hosts for this loop 30529 
1726882644.97367: getting the next task for host managed_node1 30529 1726882644.97377: done getting next task for host managed_node1 30529 1726882644.97380: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882644.97386: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882644.97409: getting variables 30529 1726882644.97411: in VariableManager get_vars() 30529 1726882644.97448: Calling all_inventory to load vars for managed_node1 30529 1726882644.97451: Calling groups_inventory to load vars for managed_node1 30529 1726882644.97453: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882644.97462: Calling all_plugins_play to load vars for managed_node1 30529 1726882644.97465: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882644.97468: Calling groups_plugins_play to load vars for managed_node1 30529 1726882644.99127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882645.00823: done with get_vars() 30529 1726882645.00845: done getting variables 30529 1726882645.00912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:25 -0400 (0:00:00.055) 0:00:59.035 ****** 30529 1726882645.00949: entering _queue_task() for managed_node1/set_fact 30529 1726882645.01432: worker is 1 (out of 1 available) 30529 1726882645.01443: exiting _queue_task() for managed_node1/set_fact 30529 1726882645.01454: done queuing things up, now waiting for results queue to drain 30529 1726882645.01455: waiting for pending results... 
30529 1726882645.01916: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882645.01921: in run() - task 12673a56-9f93-b0f1-edc0-0000000014b9 30529 1726882645.01925: variable 'ansible_search_path' from source: unknown 30529 1726882645.01928: variable 'ansible_search_path' from source: unknown 30529 1726882645.01931: calling self._execute() 30529 1726882645.01974: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882645.01983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882645.01998: variable 'omit' from source: magic vars 30529 1726882645.02414: variable 'ansible_distribution_major_version' from source: facts 30529 1726882645.02431: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882645.02610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882645.02913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882645.02958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882645.03004: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882645.03039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882645.03136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882645.03160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882645.03191: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882645.03225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882645.03326: variable '__network_is_ostree' from source: set_fact 30529 1726882645.03333: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882645.03336: when evaluation is False, skipping this task 30529 1726882645.03339: _execute() done 30529 1726882645.03342: dumping result to json 30529 1726882645.03345: done dumping result, returning 30529 1726882645.03414: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-0000000014b9] 30529 1726882645.03418: sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b9 30529 1726882645.03482: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000014b9 30529 1726882645.03486: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882645.03726: no more pending results, returning what we have 30529 1726882645.03730: results queue empty 30529 1726882645.03731: checking for any_errors_fatal 30529 1726882645.03736: done checking for any_errors_fatal 30529 1726882645.03736: checking for max_fail_percentage 30529 1726882645.03738: done checking for max_fail_percentage 30529 1726882645.03739: checking to see if all hosts have failed and the running result is not ok 30529 1726882645.03740: done checking to see if all hosts have failed 30529 1726882645.03741: getting the remaining hosts for this loop 30529 1726882645.03742: done getting the remaining hosts for this loop 
30529 1726882645.03746: getting the next task for host managed_node1 30529 1726882645.03755: done getting next task for host managed_node1 30529 1726882645.03759: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882645.03766: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882645.03785: getting variables 30529 1726882645.03787: in VariableManager get_vars() 30529 1726882645.03825: Calling all_inventory to load vars for managed_node1 30529 1726882645.03828: Calling groups_inventory to load vars for managed_node1 30529 1726882645.03830: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882645.03840: Calling all_plugins_play to load vars for managed_node1 30529 1726882645.03843: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882645.03846: Calling groups_plugins_play to load vars for managed_node1 30529 1726882645.05466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882645.07120: done with get_vars() 30529 1726882645.07148: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:25 -0400 (0:00:00.063) 0:00:59.098 ****** 30529 1726882645.07254: entering _queue_task() for managed_node1/service_facts 30529 1726882645.07723: worker is 1 (out of 1 available) 30529 1726882645.07733: exiting _queue_task() for managed_node1/service_facts 30529 1726882645.07744: done queuing things up, now waiting for results queue to drain 30529 1726882645.07745: waiting for pending results... 
30529 1726882645.08215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882645.08221: in run() - task 12673a56-9f93-b0f1-edc0-0000000014bb 30529 1726882645.08225: variable 'ansible_search_path' from source: unknown 30529 1726882645.08230: variable 'ansible_search_path' from source: unknown 30529 1726882645.08233: calling self._execute() 30529 1726882645.08301: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882645.08307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882645.08318: variable 'omit' from source: magic vars 30529 1726882645.08741: variable 'ansible_distribution_major_version' from source: facts 30529 1726882645.08753: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882645.08760: variable 'omit' from source: magic vars 30529 1726882645.08853: variable 'omit' from source: magic vars 30529 1726882645.08898: variable 'omit' from source: magic vars 30529 1726882645.08938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882645.08999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882645.09009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882645.09024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882645.09036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882645.09065: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882645.09068: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882645.09071: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882645.09186: Set connection var ansible_shell_executable to /bin/sh 30529 1726882645.09190: Set connection var ansible_pipelining to False 30529 1726882645.09198: Set connection var ansible_shell_type to sh 30529 1726882645.09217: Set connection var ansible_timeout to 10 30529 1726882645.09220: Set connection var ansible_connection to ssh 30529 1726882645.09227: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882645.09249: variable 'ansible_shell_executable' from source: unknown 30529 1726882645.09252: variable 'ansible_connection' from source: unknown 30529 1726882645.09255: variable 'ansible_module_compression' from source: unknown 30529 1726882645.09258: variable 'ansible_shell_type' from source: unknown 30529 1726882645.09260: variable 'ansible_shell_executable' from source: unknown 30529 1726882645.09262: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882645.09265: variable 'ansible_pipelining' from source: unknown 30529 1726882645.09267: variable 'ansible_timeout' from source: unknown 30529 1726882645.09323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882645.09484: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882645.09501: variable 'omit' from source: magic vars 30529 1726882645.09506: starting attempt loop 30529 1726882645.09509: running the handler 30529 1726882645.09526: _low_level_execute_command(): starting 30529 1726882645.09529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882645.10398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882645.10402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30529 1726882645.10480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882645.10485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882645.10487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882645.10489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882645.10528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882645.12211: stdout chunk (state=3): >>>/root <<< 30529 1726882645.12370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882645.12374: stdout chunk (state=3): >>><<< 30529 1726882645.12376: stderr chunk (state=3): >>><<< 30529 1726882645.12405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882645.12503: _low_level_execute_command(): starting 30529 1726882645.12507: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535 `" && echo ansible-tmp-1726882645.1241179-33371-196423003505535="` echo /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535 `" ) && sleep 0' 30529 1726882645.13066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882645.13079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882645.13108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882645.13164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882645.13239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882645.13275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882645.13309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882645.13357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882645.15229: stdout chunk (state=3): >>>ansible-tmp-1726882645.1241179-33371-196423003505535=/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535 <<< 30529 1726882645.15473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882645.15476: stdout chunk (state=3): >>><<< 30529 1726882645.15479: stderr chunk (state=3): >>><<< 30529 1726882645.15482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882645.1241179-33371-196423003505535=/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882645.15485: variable 'ansible_module_compression' from source: unknown 30529 1726882645.15516: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882645.15558: variable 'ansible_facts' from source: unknown 30529 1726882645.15664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py 30529 1726882645.15819: Sending initial data 30529 1726882645.15828: Sent initial data (162 bytes) 30529 1726882645.16478: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882645.16573: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882645.16603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882645.16672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882645.18223: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882645.18288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882645.18337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0bn6g5b0 /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py <<< 30529 1726882645.18340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py" <<< 30529 1726882645.18421: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp0bn6g5b0" to remote "/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py" <<< 30529 1726882645.19240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882645.19355: stderr chunk (state=3): >>><<< 30529 1726882645.19358: stdout chunk (state=3): >>><<< 30529 1726882645.19360: done transferring module to remote 30529 1726882645.19363: _low_level_execute_command(): starting 30529 1726882645.19365: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/ /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py && sleep 0' 30529 1726882645.19900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882645.19914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882645.19935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882645.19951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882645.20008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882645.20067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882645.20086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882645.20107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882645.20173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882645.22098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882645.22102: stdout chunk (state=3): >>><<< 30529 1726882645.22105: stderr chunk (state=3): >>><<< 30529 1726882645.22107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882645.22110: _low_level_execute_command(): starting 30529 1726882645.22113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/AnsiballZ_service_facts.py && sleep 0' 30529 1726882645.22540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882645.22554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882645.22559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882645.22573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882645.22585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882645.22595: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882645.22603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882645.22617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882645.22625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882645.22631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882645.22666: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882645.22669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882645.22671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882645.22674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882645.22676: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882645.22678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882645.22746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882645.22775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882645.22778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882645.22841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882646.73637: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": 
"sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": 
{"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": 
"systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882646.75036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882646.75043: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882646.75090: stderr chunk (state=3): >>><<< 30529 1726882646.75108: stdout chunk (state=3): >>><<< 30529 1726882646.75201: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
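The `service_facts` payload that ends above is a flat JSON mapping of unit name to a record with `name`, `state`, `status`, and `source` keys. As an editorial aside, a minimal sketch of consuming that shape outside Ansible (the sample dict below mirrors a few entries from the output above; the helper names are hypothetical, not part of the module or role):

```python
# Minimal sketch: filter a service_facts-style mapping.
# The sample mirrors the shape of the module output above; it is a
# small illustrative subset, not the full payload.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
    "network.service": {"name": "network.service", "state": "stopped",
                        "status": "not-found", "source": "systemd"},
}

def running_units(facts):
    """Names of units reported as running."""
    return sorted(n for n, rec in facts.items() if rec.get("state") == "running")

def missing_units(facts):
    """Units systemd reports as not-found (no unit file present)."""
    return sorted(n for n, rec in facts.items() if rec.get("status") == "not-found")

print(running_units(services))   # ['sshd.service']
print(missing_units(services))   # ['network.service']
```

In a playbook the same data is reachable as `ansible_facts.services` once the task has run; the role here uses it to decide which services are present before configuring networking.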
30529 1726882646.75919: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882646.75929: _low_level_execute_command(): starting 30529 1726882646.75942: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882645.1241179-33371-196423003505535/ > /dev/null 2>&1 && sleep 0' 30529 1726882646.76604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882646.76741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882646.77023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882646.77319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882646.77598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882646.79180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882646.79183: stdout chunk (state=3): >>><<< 30529 1726882646.79306: stderr chunk (state=3): >>><<< 30529 1726882646.79319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882646.79325: handler run complete 30529 1726882646.79728: variable 'ansible_facts' from source: 
unknown 30529 1726882646.79968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882646.80930: variable 'ansible_facts' from source: unknown 30529 1726882646.81325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882646.81755: attempt loop complete, returning result 30529 1726882646.81761: _execute() done 30529 1726882646.81764: dumping result to json 30529 1726882646.81915: done dumping result, returning 30529 1726882646.81928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-0000000014bb] 30529 1726882646.81934: sending task result for task 12673a56-9f93-b0f1-edc0-0000000014bb 30529 1726882646.83519: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000014bb 30529 1726882646.83522: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882646.83668: no more pending results, returning what we have 30529 1726882646.83671: results queue empty 30529 1726882646.83672: checking for any_errors_fatal 30529 1726882646.83676: done checking for any_errors_fatal 30529 1726882646.83676: checking for max_fail_percentage 30529 1726882646.83678: done checking for max_fail_percentage 30529 1726882646.83679: checking to see if all hosts have failed and the running result is not ok 30529 1726882646.83680: done checking to see if all hosts have failed 30529 1726882646.83680: getting the remaining hosts for this loop 30529 1726882646.83681: done getting the remaining hosts for this loop 30529 1726882646.83685: getting the next task for host managed_node1 30529 1726882646.83698: done getting next task for host managed_node1 30529 1726882646.83701: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 30529 1726882646.83707: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882646.83717: getting variables 30529 1726882646.83718: in VariableManager get_vars() 30529 1726882646.83746: Calling all_inventory to load vars for managed_node1 30529 1726882646.83749: Calling groups_inventory to load vars for managed_node1 30529 1726882646.83751: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882646.83759: Calling all_plugins_play to load vars for managed_node1 30529 1726882646.83762: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882646.83764: Calling groups_plugins_play to load vars for managed_node1 30529 1726882646.85426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882646.88354: done with get_vars() 30529 1726882646.88377: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:26 -0400 (0:00:01.812) 0:01:00.911 ****** 30529 1726882646.88498: entering _queue_task() for managed_node1/package_facts 30529 1726882646.89294: worker is 1 (out of 1 available) 30529 1726882646.89308: exiting _queue_task() for managed_node1/package_facts 30529 1726882646.89322: done queuing things up, now waiting for results queue to drain 30529 1726882646.89324: waiting for pending results... 
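The `get_vars()` sequence above shows the VariableManager calling progressively higher-precedence sources (all_inventory, groups_inventory, ..., groups_plugins_play) for managed_node1. A minimal sketch of that merge order, with the source names taken from the log but the per-source variable dicts and merge helper entirely hypothetical:

```python
from functools import reduce

def combine_vars(sources):
    """Merge variable dicts; later (higher-precedence) sources override earlier ones."""
    return reduce(lambda acc, src: {**acc, **src}, sources, {})

# Hypothetical per-source variable layers, lowest precedence first,
# mirroring the load order shown in the log.
layers = [
    {"ansible_host": "10.31.9.159", "interface": "eth0"},  # all_inventory
    {"interface": "eth1"},                                 # groups_plugins_inventory
    {"interface": "eth2", "mtu": 1500},                    # groups_plugins_play
]
merged = combine_vars(layers)
print(merged["interface"])  # the highest-precedence source wins
```

This is only an illustration of the precedence principle; the real implementation in `ansible.vars.manager` does considerably more (caching, lazy templating, host/group scoping).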
30529 1726882646.89779: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882646.90150: in run() - task 12673a56-9f93-b0f1-edc0-0000000014bc 30529 1726882646.90170: variable 'ansible_search_path' from source: unknown 30529 1726882646.90177: variable 'ansible_search_path' from source: unknown 30529 1726882646.90215: calling self._execute() 30529 1726882646.90484: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882646.90698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882646.90703: variable 'omit' from source: magic vars 30529 1726882646.91067: variable 'ansible_distribution_major_version' from source: facts 30529 1726882646.91498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882646.91501: variable 'omit' from source: magic vars 30529 1726882646.91504: variable 'omit' from source: magic vars 30529 1726882646.91506: variable 'omit' from source: magic vars 30529 1726882646.91508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882646.91511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882646.91718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882646.91741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882646.91759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882646.91803: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882646.91813: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882646.91822: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882646.92128: Set connection var ansible_shell_executable to /bin/sh 30529 1726882646.92139: Set connection var ansible_pipelining to False 30529 1726882646.92144: Set connection var ansible_shell_type to sh 30529 1726882646.92155: Set connection var ansible_timeout to 10 30529 1726882646.92161: Set connection var ansible_connection to ssh 30529 1726882646.92298: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882646.92301: variable 'ansible_shell_executable' from source: unknown 30529 1726882646.92303: variable 'ansible_connection' from source: unknown 30529 1726882646.92306: variable 'ansible_module_compression' from source: unknown 30529 1726882646.92308: variable 'ansible_shell_type' from source: unknown 30529 1726882646.92309: variable 'ansible_shell_executable' from source: unknown 30529 1726882646.92311: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882646.92313: variable 'ansible_pipelining' from source: unknown 30529 1726882646.92315: variable 'ansible_timeout' from source: unknown 30529 1726882646.92317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882646.92739: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882646.92743: variable 'omit' from source: magic vars 30529 1726882646.92746: starting attempt loop 30529 1726882646.92748: running the handler 30529 1726882646.92750: _low_level_execute_command(): starting 30529 1726882646.92760: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882646.93609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882646.93635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882646.93655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882646.93668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882646.93812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882646.95476: stdout chunk (state=3): >>>/root <<< 30529 1726882646.95544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882646.95547: stdout chunk (state=3): >>><<< 30529 1726882646.95557: stderr chunk (state=3): >>><<< 30529 1726882646.95575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882646.95587: _low_level_execute_command(): starting 30529 1726882646.95599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241 `" && echo ansible-tmp-1726882646.9557395-33456-232853423891241="` echo /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241 `" ) && sleep 0' 30529 1726882646.96444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882646.96461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882646.96477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882646.96500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882646.96519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882646.96612: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882646.96631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882646.96709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882646.98773: stdout chunk (state=3): >>>ansible-tmp-1726882646.9557395-33456-232853423891241=/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241 <<< 30529 1726882646.98782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882646.98795: stdout chunk (state=3): >>><<< 30529 1726882646.98806: stderr chunk (state=3): >>><<< 30529 1726882646.98825: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882646.9557395-33456-232853423891241=/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882646.98917: variable 'ansible_module_compression' from source: unknown 30529 1726882646.99114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882646.99210: variable 'ansible_facts' from source: unknown 30529 1726882646.99439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py 30529 1726882646.99569: Sending initial data 30529 1726882646.99605: Sent initial data (162 bytes) 30529 1726882647.00170: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882647.00213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882647.00229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882647.00318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882647.00336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882647.00411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882647.02104: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882647.02112: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882647.02149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882647.02166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py" <<< 30529 1726882647.02170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp51sxzbq5 /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py <<< 30529 1726882647.02283: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp51sxzbq5" to remote "/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py" <<< 30529 1726882647.03803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882647.03853: stderr chunk (state=3): >>><<< 30529 1726882647.03856: stdout chunk (state=3): >>><<< 30529 1726882647.03878: done transferring module to remote 30529 1726882647.03882: _low_level_execute_command(): starting 30529 1726882647.03888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/ /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py && sleep 0' 30529 1726882647.04964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882647.05123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882647.05139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882647.05208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882647.07000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882647.07006: stdout chunk (state=3): >>><<< 30529 1726882647.07008: stderr chunk (state=3): >>><<< 30529 1726882647.07011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882647.07019: _low_level_execute_command(): starting 30529 1726882647.07022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/AnsiballZ_package_facts.py && sleep 0' 30529 1726882647.07547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882647.07575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882647.07579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882647.07582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882647.07598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882647.07601: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882647.07680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882647.07683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882647.07685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882647.07687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882647.07689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882647.07691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882647.07694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882647.07696: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882647.07698: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882647.07699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882647.07752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882647.07763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882647.07779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882647.07852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882647.51575: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": 
"npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", 
"version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30529 1726882647.51777: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": 
"libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30529 1726882647.51786: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882647.53315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882647.53407: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
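The module result above follows the `package_facts` schema: `ansible_facts.packages` maps each package name to a list of installed instances, each carrying `name`, `version`, `release`, `epoch` (null when unset), `arch`, and `source`. A minimal sketch of consuming such a result outside Ansible, using a trimmed, illustrative sample in the same shape (the helper `evr` is hypothetical, not part of the module):

```python
import json

# Trimmed sample in the same shape as the package_facts result logged above
# (package name -> list of installed instances); the data here is illustrative.
result = json.loads("""
{"ansible_facts": {"packages": {
  "kernel": [{"name": "kernel", "version": "6.11.0",
              "release": "0.rc6.23.el10", "epoch": null,
              "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2",
               "release": "12.el10", "epoch": 1,
               "arch": "x86_64", "source": "rpm"}]
}}}
""")

def evr(pkg):
    """Render epoch:version-release, omitting a null epoch as rpm output does."""
    prefix = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    return f"{prefix}{pkg['version']}-{pkg['release']}"

packages = result["ansible_facts"]["packages"]
for name, instances in sorted(packages.items()):
    for pkg in instances:
        print(f"{name}-{evr(pkg)}.{pkg['arch']}")
# prints:
# kernel-6.11.0-0.rc6.23.el10.x86_64
# openssl-1:3.2.2-12.el10.x86_64
```

Inside a playbook, the same data is reachable after `ansible.builtin.package_facts` runs, e.g. `ansible_facts.packages['openssl'][0].version`.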
<<< 30529 1726882647.53425: stderr chunk (state=3): >>><<< 30529 1726882647.53429: stdout chunk (state=3): >>><<< 30529 1726882647.53607: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882647.58408: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882647.58428: _low_level_execute_command(): starting 30529 1726882647.58431: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882646.9557395-33456-232853423891241/ > /dev/null 2>&1 && sleep 0' 30529 1726882647.59647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882647.59651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882647.59849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882647.59853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882647.59856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882647.59922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882647.59928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882647.59942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882647.60012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882647.61865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882647.61868: stderr chunk (state=3): >>><<< 30529 1726882647.61871: stdout chunk (state=3): >>><<< 30529 1726882647.61897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882647.61900: handler run complete 30529 1726882647.64059: variable 'ansible_facts' from source: unknown 30529 1726882647.64998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882647.68899: variable 'ansible_facts' from source: unknown 30529 1726882647.70203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882647.71632: attempt loop complete, returning result 30529 1726882647.71832: _execute() done 30529 1726882647.71835: dumping result to json 30529 1726882647.72124: done dumping result, returning 30529 1726882647.72139: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-0000000014bc] 30529 1726882647.72148: sending task result for task 12673a56-9f93-b0f1-edc0-0000000014bc 30529 1726882647.75339: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000014bc 30529 1726882647.75342: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882647.75475: no more pending results, returning what we have 30529 1726882647.75479: results queue empty 30529 1726882647.75480: checking for any_errors_fatal 30529 1726882647.75485: done checking for any_errors_fatal 30529 1726882647.75485: checking for max_fail_percentage 30529 1726882647.75487: done checking for max_fail_percentage 30529 1726882647.75488: checking to see if all hosts have failed and the running result is not ok 30529 1726882647.75489: done checking to see if all hosts have failed 30529 1726882647.75489: getting the remaining hosts for this loop 30529 1726882647.75491: done getting the remaining hosts for this loop 30529 1726882647.75497: getting 
the next task for host managed_node1 30529 1726882647.75504: done getting next task for host managed_node1 30529 1726882647.75508: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882647.75514: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882647.75526: getting variables 30529 1726882647.75528: in VariableManager get_vars() 30529 1726882647.75560: Calling all_inventory to load vars for managed_node1 30529 1726882647.75563: Calling groups_inventory to load vars for managed_node1 30529 1726882647.75565: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882647.75575: Calling all_plugins_play to load vars for managed_node1 30529 1726882647.75578: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882647.75581: Calling groups_plugins_play to load vars for managed_node1 30529 1726882647.77446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882647.79324: done with get_vars() 30529 1726882647.79353: done getting variables 30529 1726882647.79419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:27 -0400 (0:00:00.909) 0:01:01.820 ****** 30529 1726882647.79454: entering _queue_task() for managed_node1/debug 30529 1726882647.79839: worker is 1 (out of 1 available) 30529 1726882647.79852: exiting _queue_task() for managed_node1/debug 30529 1726882647.79863: done queuing things up, now waiting for results queue to drain 30529 1726882647.79865: waiting for pending results... 
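Note that the "Check which packages are installed" task above reports only `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"` rather than the full package list, because the task runs with `no_log: true` (visible as `'_ansible_no_log': True` in the module args). A rough sketch of that censoring behavior, assuming a simplified result dict (this is an illustration, not Ansible's actual implementation):

```python
# Hedged sketch of no_log result censoring, mirroring the message seen in
# the log; not Ansible's real code path.
CENSOR_MSG = ("the output has been hidden due to the fact that "
              "'no_log: true' was specified for this result")

def censor_result(result, no_log):
    """Replace a task result with a censored placeholder when no_log is set."""
    if not no_log:
        return result
    # Keep only the changed flag, as in the log's `ok:` line.
    return {"censored": CENSOR_MSG, "changed": result.get("changed", False)}

res = censor_result({"changed": False, "ansible_facts": {"packages": {}}}, True)
print(res["censored"][:30])  # the output has been hidden due t
```

The censoring applies only to what is displayed and logged; the gathered facts (here, `ansible_facts.packages`) are still set on the host and usable by later tasks such as "Print network provider".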
30529 1726882647.80146: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882647.80272: in run() - task 12673a56-9f93-b0f1-edc0-000000001460 30529 1726882647.80352: variable 'ansible_search_path' from source: unknown 30529 1726882647.80356: variable 'ansible_search_path' from source: unknown 30529 1726882647.80360: calling self._execute() 30529 1726882647.80447: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882647.80465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882647.80480: variable 'omit' from source: magic vars 30529 1726882647.80884: variable 'ansible_distribution_major_version' from source: facts 30529 1726882647.80910: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882647.80921: variable 'omit' from source: magic vars 30529 1726882647.80974: variable 'omit' from source: magic vars 30529 1726882647.81082: variable 'network_provider' from source: set_fact 30529 1726882647.81199: variable 'omit' from source: magic vars 30529 1726882647.81202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882647.81208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882647.81242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882647.81265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882647.81286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882647.81326: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882647.81342: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882647.81400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882647.81473: Set connection var ansible_shell_executable to /bin/sh 30529 1726882647.81484: Set connection var ansible_pipelining to False 30529 1726882647.81499: Set connection var ansible_shell_type to sh 30529 1726882647.81516: Set connection var ansible_timeout to 10 30529 1726882647.81524: Set connection var ansible_connection to ssh 30529 1726882647.81534: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882647.81569: variable 'ansible_shell_executable' from source: unknown 30529 1726882647.81578: variable 'ansible_connection' from source: unknown 30529 1726882647.81662: variable 'ansible_module_compression' from source: unknown 30529 1726882647.81665: variable 'ansible_shell_type' from source: unknown 30529 1726882647.81668: variable 'ansible_shell_executable' from source: unknown 30529 1726882647.81670: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882647.81672: variable 'ansible_pipelining' from source: unknown 30529 1726882647.81674: variable 'ansible_timeout' from source: unknown 30529 1726882647.81676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882647.81778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882647.81805: variable 'omit' from source: magic vars 30529 1726882647.81816: starting attempt loop 30529 1726882647.81823: running the handler 30529 1726882647.81878: handler run complete 30529 1726882647.82100: attempt loop complete, returning result 30529 1726882647.82103: _execute() done 30529 1726882647.82106: dumping result to json 30529 1726882647.82108: done dumping result, returning 
30529 1726882647.82111: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000001460] 30529 1726882647.82113: sending task result for task 12673a56-9f93-b0f1-edc0-000000001460 30529 1726882647.82179: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001460 30529 1726882647.82183: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882647.82258: no more pending results, returning what we have 30529 1726882647.82261: results queue empty 30529 1726882647.82263: checking for any_errors_fatal 30529 1726882647.82273: done checking for any_errors_fatal 30529 1726882647.82274: checking for max_fail_percentage 30529 1726882647.82276: done checking for max_fail_percentage 30529 1726882647.82277: checking to see if all hosts have failed and the running result is not ok 30529 1726882647.82278: done checking to see if all hosts have failed 30529 1726882647.82278: getting the remaining hosts for this loop 30529 1726882647.82281: done getting the remaining hosts for this loop 30529 1726882647.82284: getting the next task for host managed_node1 30529 1726882647.82297: done getting next task for host managed_node1 30529 1726882647.82301: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882647.82308: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882647.82320: getting variables 30529 1726882647.82323: in VariableManager get_vars() 30529 1726882647.82360: Calling all_inventory to load vars for managed_node1 30529 1726882647.82363: Calling groups_inventory to load vars for managed_node1 30529 1726882647.82365: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882647.82376: Calling all_plugins_play to load vars for managed_node1 30529 1726882647.82380: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882647.82383: Calling groups_plugins_play to load vars for managed_node1 30529 1726882647.85357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882647.87443: done with get_vars() 30529 1726882647.87473: done getting variables 30529 1726882647.87542: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:27 -0400 (0:00:00.081) 0:01:01.901 ****** 30529 1726882647.87588: entering _queue_task() for managed_node1/fail 30529 1726882647.88265: worker is 1 (out of 1 available) 30529 1726882647.88278: exiting _queue_task() for managed_node1/fail 30529 1726882647.88596: done queuing things up, now waiting for results queue to drain 30529 1726882647.88599: waiting for pending results... 30529 1726882647.89174: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882647.89217: in run() - task 12673a56-9f93-b0f1-edc0-000000001461 30529 1726882647.89289: variable 'ansible_search_path' from source: unknown 30529 1726882647.89303: variable 'ansible_search_path' from source: unknown 30529 1726882647.89340: calling self._execute() 30529 1726882647.89516: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882647.89529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882647.89540: variable 'omit' from source: magic vars 30529 1726882647.89974: variable 'ansible_distribution_major_version' from source: facts 30529 1726882647.90000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882647.90145: variable 'network_state' from source: role '' defaults 30529 1726882647.90161: Evaluated conditional (network_state != {}): False 30529 1726882647.90169: when evaluation is False, skipping this task 30529 1726882647.90176: _execute() done 30529 1726882647.90182: dumping result to json 30529 1726882647.90188: done dumping result, returning 30529 1726882647.90207: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000001461] 30529 1726882647.90281: sending task result for task 12673a56-9f93-b0f1-edc0-000000001461 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882647.90535: no more pending results, returning what we have 30529 1726882647.90539: results queue empty 30529 1726882647.90540: checking for any_errors_fatal 30529 1726882647.90549: done checking for any_errors_fatal 30529 1726882647.90550: checking for max_fail_percentage 30529 1726882647.90551: done checking for max_fail_percentage 30529 1726882647.90552: checking to see if all hosts have failed and the running result is not ok 30529 1726882647.90553: done checking to see if all hosts have failed 30529 1726882647.90554: getting the remaining hosts for this loop 30529 1726882647.90556: done getting the remaining hosts for this loop 30529 1726882647.90559: getting the next task for host managed_node1 30529 1726882647.90568: done getting next task for host managed_node1 30529 1726882647.90572: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882647.90583: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882647.90613: getting variables 30529 1726882647.90615: in VariableManager get_vars() 30529 1726882647.90654: Calling all_inventory to load vars for managed_node1 30529 1726882647.90658: Calling groups_inventory to load vars for managed_node1 30529 1726882647.90660: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882647.90673: Calling all_plugins_play to load vars for managed_node1 30529 1726882647.90676: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882647.90679: Calling groups_plugins_play to load vars for managed_node1 30529 1726882647.91506: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001461 30529 1726882647.91509: WORKER PROCESS EXITING 30529 1726882647.92552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882647.94404: done with get_vars() 30529 1726882647.94539: done getting variables 30529 1726882647.94637: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:27 -0400 (0:00:00.070) 0:01:01.972 ****** 30529 1726882647.94671: entering _queue_task() for managed_node1/fail 30529 1726882647.95350: worker is 1 (out of 1 available) 30529 1726882647.95363: exiting _queue_task() for managed_node1/fail 30529 1726882647.95375: done queuing things up, now waiting for results queue to drain 30529 1726882647.95377: waiting for pending results... 30529 1726882647.95575: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882647.95703: in run() - task 12673a56-9f93-b0f1-edc0-000000001462 30529 1726882647.95721: variable 'ansible_search_path' from source: unknown 30529 1726882647.95725: variable 'ansible_search_path' from source: unknown 30529 1726882647.95757: calling self._execute() 30529 1726882647.95852: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882647.95857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882647.95867: variable 'omit' from source: magic vars 30529 1726882647.96242: variable 'ansible_distribution_major_version' from source: facts 30529 1726882647.96254: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882647.96383: variable 'network_state' from source: role '' defaults 30529 1726882647.96396: Evaluated conditional (network_state != {}): False 30529 1726882647.96400: when evaluation is False, skipping this task 30529 1726882647.96402: _execute() done 30529 1726882647.96405: dumping result to json 30529 1726882647.96408: done dumping result, returning 30529 1726882647.96414: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000001462] 30529 1726882647.96424: sending task result for task 12673a56-9f93-b0f1-edc0-000000001462 30529 1726882647.96512: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001462 30529 1726882647.96515: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882647.96570: no more pending results, returning what we have 30529 1726882647.96574: results queue empty 30529 1726882647.96575: checking for any_errors_fatal 30529 1726882647.96584: done checking for any_errors_fatal 30529 1726882647.96585: checking for max_fail_percentage 30529 1726882647.96586: done checking for max_fail_percentage 30529 1726882647.96587: checking to see if all hosts have failed and the running result is not ok 30529 1726882647.96588: done checking to see if all hosts have failed 30529 1726882647.96591: getting the remaining hosts for this loop 30529 1726882647.96594: done getting the remaining hosts for this loop 30529 1726882647.96598: getting the next task for host managed_node1 30529 1726882647.96606: done getting next task for host managed_node1 30529 1726882647.96609: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882647.96615: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882647.96636: getting variables 30529 1726882647.96637: in VariableManager get_vars() 30529 1726882647.96669: Calling all_inventory to load vars for managed_node1 30529 1726882647.96671: Calling groups_inventory to load vars for managed_node1 30529 1726882647.96673: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882647.96684: Calling all_plugins_play to load vars for managed_node1 30529 1726882647.96686: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882647.96691: Calling groups_plugins_play to load vars for managed_node1 30529 1726882647.98519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.01415: done with get_vars() 30529 1726882648.01438: done getting variables 30529 1726882648.01572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:37:28 -0400 (0:00:00.069) 0:01:02.042 ****** 30529 1726882648.01615: entering _queue_task() for managed_node1/fail 30529 1726882648.02148: worker is 1 (out of 1 available) 30529 1726882648.02160: exiting _queue_task() for managed_node1/fail 30529 1726882648.02171: done queuing things up, now waiting for results queue to drain 30529 1726882648.02173: waiting for pending results... 30529 1726882648.02537: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882648.02617: in run() - task 12673a56-9f93-b0f1-edc0-000000001463 30529 1726882648.02635: variable 'ansible_search_path' from source: unknown 30529 1726882648.02639: variable 'ansible_search_path' from source: unknown 30529 1726882648.02668: calling self._execute() 30529 1726882648.02787: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.02835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.02843: variable 'omit' from source: magic vars 30529 1726882648.03280: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.03291: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.03478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.06498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.06506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.06510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.06513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 
1726882648.06516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.06620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.06658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.06690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.06744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.06771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.06891: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.06913: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882648.07053: variable 'ansible_distribution' from source: facts 30529 1726882648.07056: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.07074: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882648.07373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.07405: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.07447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.07479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.07517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.07547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.07576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.07605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.07642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.07721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882648.07727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.07825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.07828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.07830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.07841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.08203: variable 'network_connections' from source: include params 30529 1726882648.08213: variable 'interface' from source: play vars 30529 1726882648.08278: variable 'interface' from source: play vars 30529 1726882648.08299: variable 'network_state' from source: role '' defaults 30529 1726882648.08361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.08546: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.08655: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.08658: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.08661: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.08678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.08704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.08734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.08768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.08790: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882648.08794: when evaluation is False, skipping this task 30529 1726882648.08800: _execute() done 30529 1726882648.08803: dumping result to json 30529 1726882648.08805: done dumping result, returning 30529 1726882648.08814: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000001463] 30529 1726882648.08818: sending task result for task 12673a56-9f93-b0f1-edc0-000000001463 30529 1726882648.09026: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001463 30529 1726882648.09030: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882648.09075: no more pending results, returning what we have 30529 1726882648.09079: results queue empty 30529 1726882648.09080: checking for any_errors_fatal 30529 1726882648.09086: done checking for any_errors_fatal 30529 1726882648.09087: checking for max_fail_percentage 30529 1726882648.09088: done checking for max_fail_percentage 30529 1726882648.09092: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.09095: done checking to see if all hosts have failed 30529 1726882648.09096: getting the remaining hosts for this loop 30529 1726882648.09098: done getting the remaining hosts for this loop 30529 1726882648.09101: getting the next task for host managed_node1 30529 1726882648.09109: done getting next task for host managed_node1 30529 1726882648.09113: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882648.09118: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882648.09140: getting variables 30529 1726882648.09142: in VariableManager get_vars() 30529 1726882648.09181: Calling all_inventory to load vars for managed_node1 30529 1726882648.09184: Calling groups_inventory to load vars for managed_node1 30529 1726882648.09186: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.09386: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.09395: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.09399: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.11380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.13373: done with get_vars() 30529 1726882648.13400: done getting variables 30529 1726882648.13467: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:28 -0400 (0:00:00.118) 0:01:02.161 ****** 30529 1726882648.13510: entering _queue_task() for managed_node1/dnf 30529 1726882648.13853: worker is 1 (out of 1 available) 30529 1726882648.13866: exiting _queue_task() for managed_node1/dnf 30529 1726882648.13879: done queuing things up, now waiting for results queue to drain 30529 1726882648.13880: waiting for pending results... 30529 1726882648.14167: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882648.14343: in run() - task 12673a56-9f93-b0f1-edc0-000000001464 30529 1726882648.14347: variable 'ansible_search_path' from source: unknown 30529 1726882648.14350: variable 'ansible_search_path' from source: unknown 30529 1726882648.14400: calling self._execute() 30529 1726882648.14524: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.14531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.14535: variable 'omit' from source: magic vars 30529 1726882648.14987: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.14995: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.15310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.17677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.17763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.17804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882648.17848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.17874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.17963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.18001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.18027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.18074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.18144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.18219: variable 'ansible_distribution' from source: facts 30529 1726882648.18223: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.18247: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882648.18503: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.18517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882648.18537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.18563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.18610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.18625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.18666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.18802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.18805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.18808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.18810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.18839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.18866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.18897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.18939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.18962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.19110: variable 'network_connections' from source: include params 30529 1726882648.19123: variable 'interface' from source: play vars 30529 1726882648.19186: variable 'interface' from source: play vars 30529 1726882648.19260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.19456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.19507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.19549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.19678: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.19681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.19684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.19716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.19736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.19781: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882648.20014: variable 'network_connections' from source: include params 30529 1726882648.20018: variable 'interface' from source: play vars 30529 1726882648.20092: variable 'interface' from source: play vars 30529 1726882648.20118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882648.20121: when evaluation is False, skipping this task 30529 1726882648.20123: _execute() done 30529 1726882648.20126: dumping result to json 30529 1726882648.20128: done dumping result, returning 30529 1726882648.20130: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001464] 30529 1726882648.20133: sending task result for task 12673a56-9f93-b0f1-edc0-000000001464 30529 1726882648.20303: 
done sending task result for task 12673a56-9f93-b0f1-edc0-000000001464 30529 1726882648.20305: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882648.20541: no more pending results, returning what we have 30529 1726882648.20545: results queue empty 30529 1726882648.20546: checking for any_errors_fatal 30529 1726882648.20552: done checking for any_errors_fatal 30529 1726882648.20553: checking for max_fail_percentage 30529 1726882648.20554: done checking for max_fail_percentage 30529 1726882648.20555: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.20556: done checking to see if all hosts have failed 30529 1726882648.20556: getting the remaining hosts for this loop 30529 1726882648.20558: done getting the remaining hosts for this loop 30529 1726882648.20561: getting the next task for host managed_node1 30529 1726882648.20568: done getting next task for host managed_node1 30529 1726882648.20572: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882648.20577: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882648.20604: getting variables 30529 1726882648.20606: in VariableManager get_vars() 30529 1726882648.20645: Calling all_inventory to load vars for managed_node1 30529 1726882648.20648: Calling groups_inventory to load vars for managed_node1 30529 1726882648.20650: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.20659: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.20664: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.20668: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.22215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.24037: done with get_vars() 30529 1726882648.24067: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882648.24147: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:28 -0400 (0:00:00.106) 0:01:02.267 ****** 30529 1726882648.24179: entering _queue_task() for managed_node1/yum 30529 1726882648.24549: worker is 1 (out of 1 available) 30529 1726882648.24561: exiting _queue_task() for managed_node1/yum 30529 1726882648.24576: done queuing things up, now waiting for results queue to drain 30529 1726882648.24577: waiting for pending results... 30529 1726882648.25060: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882648.25065: in run() - task 12673a56-9f93-b0f1-edc0-000000001465 30529 1726882648.25140: variable 'ansible_search_path' from source: unknown 30529 1726882648.25145: variable 'ansible_search_path' from source: unknown 30529 1726882648.25152: calling self._execute() 30529 1726882648.25326: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.25331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.25337: variable 'omit' from source: magic vars 30529 1726882648.25701: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.25705: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.25899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.29077: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.29158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.29198: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.29250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.29263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.29360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.29369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.29401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.29440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.29468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.29551: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.29576: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882648.29685: when evaluation is False, skipping this task 30529 1726882648.29690: _execute() done 30529 1726882648.29698: dumping result to json 30529 1726882648.29700: done dumping result, returning 30529 1726882648.29702: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001465] 30529 1726882648.29704: sending task result for task 12673a56-9f93-b0f1-edc0-000000001465 30529 1726882648.29769: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001465 30529 1726882648.29772: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882648.29850: no more pending results, returning what we have 30529 1726882648.29853: results queue empty 30529 1726882648.29854: checking for any_errors_fatal 30529 1726882648.29861: done checking for any_errors_fatal 30529 1726882648.29862: checking for max_fail_percentage 30529 1726882648.29864: done checking for max_fail_percentage 30529 1726882648.29865: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.29866: done checking to see if all hosts have failed 30529 1726882648.29866: getting the remaining hosts for this loop 30529 1726882648.29868: done getting the remaining hosts for this loop 30529 1726882648.29872: getting the next task for host managed_node1 30529 1726882648.29881: done getting next task for host managed_node1 30529 1726882648.29885: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882648.29894: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882648.29919: getting variables 30529 1726882648.29921: in VariableManager get_vars() 30529 1726882648.29960: Calling all_inventory to load vars for managed_node1 30529 1726882648.29962: Calling groups_inventory to load vars for managed_node1 30529 1726882648.29964: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.29974: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.29977: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.29980: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.31472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.33033: done with get_vars() 30529 1726882648.33055: done getting variables 30529 1726882648.33120: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:28 -0400 (0:00:00.089) 0:01:02.357 ****** 30529 1726882648.33157: entering _queue_task() for managed_node1/fail 30529 1726882648.33481: worker is 1 (out of 1 available) 30529 1726882648.33500: exiting _queue_task() for managed_node1/fail 30529 1726882648.33515: done queuing things up, now waiting for results queue to drain 30529 1726882648.33517: waiting for pending results... 30529 1726882648.33822: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882648.33944: in run() - task 12673a56-9f93-b0f1-edc0-000000001466 30529 1726882648.33958: variable 'ansible_search_path' from source: unknown 30529 1726882648.33962: variable 'ansible_search_path' from source: unknown 30529 1726882648.33998: calling self._execute() 30529 1726882648.34132: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.34136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.34139: variable 'omit' from source: magic vars 30529 1726882648.34513: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.34517: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.34601: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.34744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.36221: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.36272: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.36319: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.36340: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.36362: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.36592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.36597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.36600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.36603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.36605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.36607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.36798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.36801: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.36804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.36806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.36808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.36810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.36811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.36828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.36842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.37003: variable 'network_connections' from source: include params 30529 1726882648.37013: variable 'interface' from source: play vars 30529 1726882648.37073: variable 'interface' from source: play vars 30529 1726882648.37146: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.37306: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.37342: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.37369: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.37400: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.37442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.37768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.37788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.37814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.37853: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882648.38006: variable 'network_connections' from source: include params 30529 1726882648.38009: variable 'interface' from source: play vars 30529 1726882648.38056: variable 'interface' from source: play vars 30529 1726882648.38074: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882648.38078: when evaluation is False, skipping this task 30529 
1726882648.38080: _execute() done 30529 1726882648.38083: dumping result to json 30529 1726882648.38085: done dumping result, returning 30529 1726882648.38094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001466] 30529 1726882648.38100: sending task result for task 12673a56-9f93-b0f1-edc0-000000001466 30529 1726882648.38189: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001466 30529 1726882648.38192: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882648.38283: no more pending results, returning what we have 30529 1726882648.38286: results queue empty 30529 1726882648.38287: checking for any_errors_fatal 30529 1726882648.38296: done checking for any_errors_fatal 30529 1726882648.38297: checking for max_fail_percentage 30529 1726882648.38298: done checking for max_fail_percentage 30529 1726882648.38299: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.38300: done checking to see if all hosts have failed 30529 1726882648.38301: getting the remaining hosts for this loop 30529 1726882648.38306: done getting the remaining hosts for this loop 30529 1726882648.38311: getting the next task for host managed_node1 30529 1726882648.38319: done getting next task for host managed_node1 30529 1726882648.38323: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882648.38328: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882648.38352: getting variables 30529 1726882648.38354: in VariableManager get_vars() 30529 1726882648.38388: Calling all_inventory to load vars for managed_node1 30529 1726882648.38390: Calling groups_inventory to load vars for managed_node1 30529 1726882648.38394: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.38404: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.38406: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.38409: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.44277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.45223: done with get_vars() 30529 1726882648.45241: done getting variables 30529 1726882648.45274: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:28 -0400 (0:00:00.121) 0:01:02.479 ****** 30529 1726882648.45301: entering _queue_task() for managed_node1/package 30529 1726882648.45567: worker is 1 (out of 1 available) 30529 1726882648.45580: exiting _queue_task() for managed_node1/package 30529 1726882648.45598: done queuing things up, now waiting for results queue to drain 30529 1726882648.45601: waiting for pending results... 30529 1726882648.45777: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882648.45896: in run() - task 12673a56-9f93-b0f1-edc0-000000001467 30529 1726882648.45905: variable 'ansible_search_path' from source: unknown 30529 1726882648.45908: variable 'ansible_search_path' from source: unknown 30529 1726882648.45939: calling self._execute() 30529 1726882648.46009: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.46014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.46023: variable 'omit' from source: magic vars 30529 1726882648.46314: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.46358: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.46699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.46803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.46860: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.46942: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.46982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.47115: variable 'network_packages' from source: role '' defaults 30529 1726882648.47235: variable '__network_provider_setup' from source: role '' defaults 30529 1726882648.47263: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882648.47334: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882648.47341: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882648.47388: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882648.47506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.48834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.48875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.48909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.48933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.48960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.49021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.49041: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.49058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.49084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.49101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.49133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.49149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.49165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.49189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.49205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882648.49345: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882648.49415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.49433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.49450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.49474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.49484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.49549: variable 'ansible_python' from source: facts 30529 1726882648.49562: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882648.49619: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882648.49674: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882648.49760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.49775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.49792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.49820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.49831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.49863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.49884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.49905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.49929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.49940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.50038: variable 'network_connections' from source: include params 
30529 1726882648.50042: variable 'interface' from source: play vars 30529 1726882648.50119: variable 'interface' from source: play vars 30529 1726882648.50167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.50186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.50214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.50235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.50272: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.50454: variable 'network_connections' from source: include params 30529 1726882648.50457: variable 'interface' from source: play vars 30529 1726882648.50530: variable 'interface' from source: play vars 30529 1726882648.50553: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882648.50608: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.50800: variable 'network_connections' from source: include params 30529 1726882648.50803: variable 'interface' from source: play vars 30529 1726882648.50848: variable 'interface' from source: play vars 30529 1726882648.50867: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882648.50922: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882648.51118: variable 'network_connections' 
from source: include params 30529 1726882648.51122: variable 'interface' from source: play vars 30529 1726882648.51165: variable 'interface' from source: play vars 30529 1726882648.51206: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882648.51248: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882648.51253: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882648.51297: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882648.51433: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882648.51736: variable 'network_connections' from source: include params 30529 1726882648.51739: variable 'interface' from source: play vars 30529 1726882648.51781: variable 'interface' from source: play vars 30529 1726882648.51787: variable 'ansible_distribution' from source: facts 30529 1726882648.51790: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.51801: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.51812: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882648.51921: variable 'ansible_distribution' from source: facts 30529 1726882648.51925: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.51928: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.51940: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882648.52047: variable 'ansible_distribution' from source: facts 30529 1726882648.52051: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.52054: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.52082: variable 'network_provider' from source: set_fact 30529 
1726882648.52099: variable 'ansible_facts' from source: unknown 30529 1726882648.52528: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882648.52531: when evaluation is False, skipping this task 30529 1726882648.52534: _execute() done 30529 1726882648.52536: dumping result to json 30529 1726882648.52538: done dumping result, returning 30529 1726882648.52546: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000001467] 30529 1726882648.52549: sending task result for task 12673a56-9f93-b0f1-edc0-000000001467 30529 1726882648.52639: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001467 30529 1726882648.52642: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882648.52689: no more pending results, returning what we have 30529 1726882648.52692: results queue empty 30529 1726882648.52695: checking for any_errors_fatal 30529 1726882648.52704: done checking for any_errors_fatal 30529 1726882648.52704: checking for max_fail_percentage 30529 1726882648.52706: done checking for max_fail_percentage 30529 1726882648.52707: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.52708: done checking to see if all hosts have failed 30529 1726882648.52709: getting the remaining hosts for this loop 30529 1726882648.52710: done getting the remaining hosts for this loop 30529 1726882648.52714: getting the next task for host managed_node1 30529 1726882648.52722: done getting next task for host managed_node1 30529 1726882648.52726: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882648.52730: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882648.52751: getting variables 30529 1726882648.52753: in VariableManager get_vars() 30529 1726882648.52794: Calling all_inventory to load vars for managed_node1 30529 1726882648.52797: Calling groups_inventory to load vars for managed_node1 30529 1726882648.52799: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.52809: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.52812: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.52814: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.53624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.54494: done with get_vars() 30529 1726882648.54512: done getting variables 30529 1726882648.54555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:28 -0400 (0:00:00.092) 0:01:02.571 ****** 30529 1726882648.54580: entering _queue_task() for managed_node1/package 30529 1726882648.54831: worker is 1 (out of 1 available) 30529 1726882648.54844: exiting _queue_task() for managed_node1/package 30529 1726882648.54857: done queuing things up, now waiting for results queue to drain 30529 1726882648.54859: waiting for pending results... 
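The "Install packages" skip above hinges on the Jinja2 `subset` test: `not network_packages is subset(ansible_facts.packages.keys())`. A minimal Python sketch of that set logic follows; the package names are illustrative stand-ins, not values from this run:

```python
# Sketch of the set logic behind Ansible's `subset` Jinja2 test, as used in
# the "Install packages" conditional:
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names below are illustrative, not taken from this run.

def is_subset(needed, available):
    """True when every item in `needed` also appears in `available`."""
    return set(needed) <= set(available)

network_packages = ["NetworkManager"]
installed_packages = {"NetworkManager", "openssh", "kernel"}

# Everything required is already present, so the negated condition is
# False and the task is skipped ("Conditional result was False").
condition = not is_subset(network_packages, installed_packages)
```

When a required package is missing, the subset test fails, the negation becomes True, and the task runs instead of skipping.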
30529 1726882648.55053: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882648.55151: in run() - task 12673a56-9f93-b0f1-edc0-000000001468 30529 1726882648.55166: variable 'ansible_search_path' from source: unknown 30529 1726882648.55169: variable 'ansible_search_path' from source: unknown 30529 1726882648.55201: calling self._execute() 30529 1726882648.55278: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.55282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.55296: variable 'omit' from source: magic vars 30529 1726882648.55583: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.55597: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.55686: variable 'network_state' from source: role '' defaults 30529 1726882648.55697: Evaluated conditional (network_state != {}): False 30529 1726882648.55700: when evaluation is False, skipping this task 30529 1726882648.55704: _execute() done 30529 1726882648.55707: dumping result to json 30529 1726882648.55710: done dumping result, returning 30529 1726882648.55717: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001468] 30529 1726882648.55722: sending task result for task 12673a56-9f93-b0f1-edc0-000000001468 30529 1726882648.55816: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001468 30529 1726882648.55819: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882648.55887: no more pending results, returning what we have 30529 1726882648.55894: results queue empty 30529 1726882648.55896: checking 
for any_errors_fatal 30529 1726882648.55901: done checking for any_errors_fatal 30529 1726882648.55901: checking for max_fail_percentage 30529 1726882648.55903: done checking for max_fail_percentage 30529 1726882648.55904: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.55905: done checking to see if all hosts have failed 30529 1726882648.55905: getting the remaining hosts for this loop 30529 1726882648.55907: done getting the remaining hosts for this loop 30529 1726882648.55910: getting the next task for host managed_node1 30529 1726882648.55919: done getting next task for host managed_node1 30529 1726882648.55922: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882648.55927: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882648.55946: getting variables 30529 1726882648.55948: in VariableManager get_vars() 30529 1726882648.55979: Calling all_inventory to load vars for managed_node1 30529 1726882648.55982: Calling groups_inventory to load vars for managed_node1 30529 1726882648.55984: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.55996: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.55999: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.56002: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.56903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.57764: done with get_vars() 30529 1726882648.57780: done getting variables 30529 1726882648.57824: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:28 -0400 (0:00:00.032) 0:01:02.604 ****** 30529 1726882648.57848: entering _queue_task() for managed_node1/package 30529 1726882648.58077: worker is 1 (out of 1 available) 30529 1726882648.58095: exiting _queue_task() for managed_node1/package 30529 1726882648.58107: done queuing things up, now waiting for results queue to drain 30529 1726882648.58109: waiting for pending results... 
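Both nmstate-related tasks above evaluate their conditionals one at a time, exactly as the log shows: `ansible_distribution_major_version != '6'` passes, then `network_state != {}` fails against the role's empty-dict default, producing the skip. A hedged sketch of that first-False-wins evaluation (the variable values mirror the log; the helper itself is illustrative, not Ansible's implementation):

```python
# Illustrative model of how a task's `when:` list is evaluated in this log:
# each conditional in order, stopping at the first False.

def evaluate_when(conditions, variables):
    """Return False at the first failing conditional, True if all pass."""
    for cond in conditions:
        if not cond(variables):
            return False  # "when evaluation is False, skipping this task"
    return True

# Values mirroring the log: the distro check passes, network_state is the
# role's empty-dict default, so the second conditional fails.
variables = {"ansible_distribution_major_version": "9", "network_state": {}}
conditions = [
    lambda v: v["ansible_distribution_major_version"] != "6",  # True
    lambda v: v["network_state"] != {},                        # False
]

run_task = evaluate_when(conditions, variables)  # False -> task skipped
```

Supplying a non-empty `network_state` flips the second conditional to True, and the NetworkManager/nmstate install tasks would then execute.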
30529 1726882648.58278: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882648.58381: in run() - task 12673a56-9f93-b0f1-edc0-000000001469 30529 1726882648.58396: variable 'ansible_search_path' from source: unknown 30529 1726882648.58400: variable 'ansible_search_path' from source: unknown 30529 1726882648.58424: calling self._execute() 30529 1726882648.58495: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.58499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.58509: variable 'omit' from source: magic vars 30529 1726882648.58776: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.58786: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.58868: variable 'network_state' from source: role '' defaults 30529 1726882648.58880: Evaluated conditional (network_state != {}): False 30529 1726882648.58883: when evaluation is False, skipping this task 30529 1726882648.58886: _execute() done 30529 1726882648.58891: dumping result to json 30529 1726882648.58896: done dumping result, returning 30529 1726882648.58899: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001469] 30529 1726882648.58901: sending task result for task 12673a56-9f93-b0f1-edc0-000000001469 30529 1726882648.58999: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001469 30529 1726882648.59001: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882648.59045: no more pending results, returning what we have 30529 1726882648.59049: results queue empty 30529 1726882648.59050: checking for 
any_errors_fatal 30529 1726882648.59054: done checking for any_errors_fatal 30529 1726882648.59055: checking for max_fail_percentage 30529 1726882648.59056: done checking for max_fail_percentage 30529 1726882648.59057: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.59058: done checking to see if all hosts have failed 30529 1726882648.59059: getting the remaining hosts for this loop 30529 1726882648.59060: done getting the remaining hosts for this loop 30529 1726882648.59064: getting the next task for host managed_node1 30529 1726882648.59071: done getting next task for host managed_node1 30529 1726882648.59074: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882648.59079: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882648.59101: getting variables 30529 1726882648.59103: in VariableManager get_vars() 30529 1726882648.59133: Calling all_inventory to load vars for managed_node1 30529 1726882648.59136: Calling groups_inventory to load vars for managed_node1 30529 1726882648.59138: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.59146: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.59148: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.59151: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.59882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.60755: done with get_vars() 30529 1726882648.60770: done getting variables 30529 1726882648.60815: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:28 -0400 (0:00:00.029) 0:01:02.634 ****** 30529 1726882648.60839: entering _queue_task() for managed_node1/service 30529 1726882648.61036: worker is 1 (out of 1 available) 30529 1726882648.61048: exiting _queue_task() for managed_node1/service 30529 1726882648.61060: done queuing things up, now waiting for results queue to drain 30529 1726882648.61062: waiting for pending results... 
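Each TASK banner in this log carries a timing line such as `Friday 20 September 2024 21:37:28 -0400 (0:00:00.121) 0:01:02.634`. Assuming the profile_tasks-style convention (parenthesized field = previous task's duration, trailing field = cumulative playbook time), a small parser sketch:

```python
import re

# Hedged parser for the per-task timing banner seen in this log.
# Assumption: "(H:MM:SS.mmm)" is the previous task's duration and the
# trailing stamp is cumulative elapsed time (profile_tasks-style output).

BANNER = r"\((?P<dur>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"

def to_seconds(stamp: str) -> float:
    """Convert 'H:MM:SS.mmm' to seconds."""
    h, m, s = stamp.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def parse_banner(line: str):
    """Return (task_duration_s, cumulative_s) or None if no timing found."""
    m = re.search(BANNER, line)
    if not m:
        return None
    return to_seconds(m.group("dur")), to_seconds(m.group("total"))

line = "Friday 20 September 2024 21:37:28 -0400 (0:00:00.121) 0:01:02.479 ******"
dur, total = parse_banner(line)  # dur ~0.121 s, total ~62.479 s
```

Applied across the banners above, this makes the cheap conditional skips visible: the skipped tasks each add only tens of milliseconds to the cumulative total.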
30529 1726882648.61225: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882648.61322: in run() - task 12673a56-9f93-b0f1-edc0-00000000146a 30529 1726882648.61332: variable 'ansible_search_path' from source: unknown 30529 1726882648.61336: variable 'ansible_search_path' from source: unknown 30529 1726882648.61361: calling self._execute() 30529 1726882648.61433: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.61436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.61446: variable 'omit' from source: magic vars 30529 1726882648.61710: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.61721: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.61802: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.61928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.63985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.64079: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.64130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.64202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.64234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.64361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882648.64400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.64405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.64441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.64451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.64494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.64510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.64549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.64615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.64620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.64697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.64704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.64724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.64747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.64792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.64951: variable 'network_connections' from source: include params 30529 1726882648.64961: variable 'interface' from source: play vars 30529 1726882648.65009: variable 'interface' from source: play vars 30529 1726882648.65061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.65167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.65196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.65228: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.65344: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.65347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.65349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.65351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.65505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.65508: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882648.65635: variable 'network_connections' from source: include params 30529 1726882648.65639: variable 'interface' from source: play vars 30529 1726882648.65697: variable 'interface' from source: play vars 30529 1726882648.65898: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882648.65901: when evaluation is False, skipping this task 30529 1726882648.65903: _execute() done 30529 1726882648.65905: dumping result to json 30529 1726882648.65907: done dumping result, returning 30529 1726882648.65909: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000146a] 30529 1726882648.65910: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146a 30529 1726882648.65973: done sending task result for task 
12673a56-9f93-b0f1-edc0-00000000146a 30529 1726882648.65981: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882648.66021: no more pending results, returning what we have 30529 1726882648.66024: results queue empty 30529 1726882648.66025: checking for any_errors_fatal 30529 1726882648.66031: done checking for any_errors_fatal 30529 1726882648.66031: checking for max_fail_percentage 30529 1726882648.66032: done checking for max_fail_percentage 30529 1726882648.66033: checking to see if all hosts have failed and the running result is not ok 30529 1726882648.66034: done checking to see if all hosts have failed 30529 1726882648.66035: getting the remaining hosts for this loop 30529 1726882648.66036: done getting the remaining hosts for this loop 30529 1726882648.66040: getting the next task for host managed_node1 30529 1726882648.66047: done getting next task for host managed_node1 30529 1726882648.66050: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882648.66055: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882648.66074: getting variables 30529 1726882648.66076: in VariableManager get_vars() 30529 1726882648.66222: Calling all_inventory to load vars for managed_node1 30529 1726882648.66224: Calling groups_inventory to load vars for managed_node1 30529 1726882648.66227: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882648.66235: Calling all_plugins_play to load vars for managed_node1 30529 1726882648.66238: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882648.66241: Calling groups_plugins_play to load vars for managed_node1 30529 1726882648.67811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882648.68692: done with get_vars() 30529 1726882648.68710: done getting variables 30529 1726882648.68750: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:28 -0400 (0:00:00.079) 0:01:02.713 ****** 30529 1726882648.68774: entering _queue_task() for managed_node1/service 30529 1726882648.69026: worker is 1 (out of 1 available) 30529 1726882648.69038: exiting _queue_task() for managed_node1/service 30529 1726882648.69050: done 
queuing things up, now waiting for results queue to drain 30529 1726882648.69052: waiting for pending results... 30529 1726882648.69226: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882648.69332: in run() - task 12673a56-9f93-b0f1-edc0-00000000146b 30529 1726882648.69343: variable 'ansible_search_path' from source: unknown 30529 1726882648.69347: variable 'ansible_search_path' from source: unknown 30529 1726882648.69375: calling self._execute() 30529 1726882648.69604: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.69608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.69611: variable 'omit' from source: magic vars 30529 1726882648.70372: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.70384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882648.70747: variable 'network_provider' from source: set_fact 30529 1726882648.70751: variable 'network_state' from source: role '' defaults 30529 1726882648.70763: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882648.70770: variable 'omit' from source: magic vars 30529 1726882648.70833: variable 'omit' from source: magic vars 30529 1726882648.70860: variable 'network_service_name' from source: role '' defaults 30529 1726882648.70970: variable 'network_service_name' from source: role '' defaults 30529 1726882648.71101: variable '__network_provider_setup' from source: role '' defaults 30529 1726882648.71105: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882648.71359: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882648.71362: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882648.71546: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882648.72049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882648.75399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882648.75469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882648.75509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882648.75543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882648.75571: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882648.75738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.75742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.75744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.75753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.75768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.75813: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.75842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.75873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.75911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.75922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.76298: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882648.76301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.76320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.76344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.76382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.76408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.76505: variable 'ansible_python' from source: facts 30529 1726882648.76521: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882648.76698: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882648.76702: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882648.76818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.76850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.76873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.76915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.76928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.76982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882648.77009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882648.77032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.77077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882648.77090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882648.77235: variable 'network_connections' from source: include params 30529 1726882648.77244: variable 'interface' from source: play vars 30529 1726882648.77328: variable 'interface' from source: play vars 30529 1726882648.77441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882648.77651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882648.77711: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882648.77752: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882648.77791: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882648.77861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882648.77890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882648.77929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882648.77958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882648.78006: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.78381: variable 'network_connections' from source: include params 30529 1726882648.78384: variable 'interface' from source: play vars 30529 1726882648.78387: variable 'interface' from source: play vars 30529 1726882648.78420: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882648.78506: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882648.78825: variable 'network_connections' from source: include params 30529 1726882648.78829: variable 'interface' from source: play vars 30529 1726882648.78910: variable 'interface' from source: play vars 30529 1726882648.79298: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882648.79301: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882648.79304: variable 'network_connections' from source: include params 30529 1726882648.79306: variable 'interface' from source: play vars 30529 1726882648.79375: variable 'interface' from source: play vars 30529 1726882648.79427: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882648.79491: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882648.79504: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882648.79568: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882648.79804: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882648.80541: variable 'network_connections' from source: include params 30529 1726882648.80546: variable 'interface' from source: play vars 30529 1726882648.80609: variable 'interface' from source: play vars 30529 1726882648.80617: variable 'ansible_distribution' from source: facts 30529 1726882648.80620: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.80636: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.80650: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882648.80829: variable 'ansible_distribution' from source: facts 30529 1726882648.80832: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.80838: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.80858: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882648.81038: variable 'ansible_distribution' from source: facts 30529 1726882648.81041: variable '__network_rh_distros' from source: role '' defaults 30529 1726882648.81047: variable 'ansible_distribution_major_version' from source: facts 30529 1726882648.81089: variable 'network_provider' from source: set_fact 30529 1726882648.81118: variable 'omit' from source: magic vars 30529 1726882648.81143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882648.81171: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882648.81202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882648.81214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882648.81225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882648.81253: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882648.81256: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.81259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882648.81369: Set connection var ansible_shell_executable to /bin/sh 30529 1726882648.81566: Set connection var ansible_pipelining to False 30529 1726882648.81570: Set connection var ansible_shell_type to sh 30529 1726882648.81572: Set connection var ansible_timeout to 10 30529 1726882648.81575: Set connection var ansible_connection to ssh 30529 1726882648.81577: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882648.81580: variable 'ansible_shell_executable' from source: unknown 30529 1726882648.81582: variable 'ansible_connection' from source: unknown 30529 1726882648.81583: variable 'ansible_module_compression' from source: unknown 30529 1726882648.81585: variable 'ansible_shell_type' from source: unknown 30529 1726882648.81587: variable 'ansible_shell_executable' from source: unknown 30529 1726882648.81589: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882648.81591: variable 'ansible_pipelining' from source: unknown 30529 1726882648.81595: variable 'ansible_timeout' from source: unknown 30529 1726882648.81597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882648.81600: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882648.81606: variable 'omit' from source: magic vars 30529 1726882648.81608: starting attempt loop 30529 1726882648.81611: running the handler 30529 1726882648.81664: variable 'ansible_facts' from source: unknown 30529 1726882648.82446: _low_level_execute_command(): starting 30529 1726882648.82453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882648.83190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882648.83208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882648.83291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.83328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882648.83344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
30529 1726882648.83359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882648.83443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882648.85133: stdout chunk (state=3): >>>/root <<< 30529 1726882648.85283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882648.85286: stdout chunk (state=3): >>><<< 30529 1726882648.85288: stderr chunk (state=3): >>><<< 30529 1726882648.85309: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882648.85401: _low_level_execute_command(): starting 30529 1726882648.85405: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433 `" && echo ansible-tmp-1726882648.8531709-33524-201164294426433="` echo /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433 `" ) && sleep 0' 30529 1726882648.85931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882648.85935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882648.85937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.85945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882648.85956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.86001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882648.86004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882648.86008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882648.86052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882648.87917: stdout chunk (state=3): 
>>>ansible-tmp-1726882648.8531709-33524-201164294426433=/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433 <<< 30529 1726882648.88070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882648.88076: stdout chunk (state=3): >>><<< 30529 1726882648.88079: stderr chunk (state=3): >>><<< 30529 1726882648.88234: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882648.8531709-33524-201164294426433=/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882648.88238: variable 'ansible_module_compression' from source: unknown 30529 1726882648.88241: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882648.88271: variable 'ansible_facts' 
from source: unknown 30529 1726882648.88413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py 30529 1726882648.88517: Sending initial data 30529 1726882648.88521: Sent initial data (156 bytes) 30529 1726882648.88939: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882648.88942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882648.88948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.88951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882648.88953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.88990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882648.89012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882648.89057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882648.90600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882648.90642: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882648.90726: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8wqh0uzc /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py <<< 30529 1726882648.90730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py" <<< 30529 1726882648.90770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8wqh0uzc" to remote "/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py" <<< 30529 1726882648.92100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882648.92107: stderr chunk (state=3): >>><<< 30529 1726882648.92110: stdout chunk (state=3): >>><<< 30529 1726882648.92144: done transferring module to remote 30529 1726882648.92153: _low_level_execute_command(): starting 30529 1726882648.92157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/ /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py && sleep 0' 30529 1726882648.92567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882648.92571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882648.92604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.92607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882648.92614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882648.92616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.92657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882648.92660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882648.92709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882648.94461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882648.94465: stdout chunk (state=3): >>><<< 30529 1726882648.94467: stderr chunk (state=3): 
>>><<< 30529 1726882648.94483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882648.94568: _low_level_execute_command(): starting 30529 1726882648.94571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/AnsiballZ_systemd.py && sleep 0' 30529 1726882648.95138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882648.95151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882648.95162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882648.95213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882648.95220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882648.95266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.23863: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not 
set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10895360", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306487808", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1818185000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", 
"CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882649.23920: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882649.23938: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882649.25608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882649.25666: stderr chunk (state=3): >>><<< 30529 1726882649.25669: stdout chunk (state=3): >>><<< 30529 1726882649.25717: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10895360", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306487808", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1818185000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882649.25869: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882649.25906: _low_level_execute_command(): starting 30529 1726882649.25909: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882648.8531709-33524-201164294426433/ > /dev/null 2>&1 && sleep 0' 30529 1726882649.26386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882649.26390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882649.26392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.26397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882649.26399: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882649.26402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.26468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882649.26472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882649.26474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.26514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.28273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882649.28332: stderr chunk (state=3): >>><<< 30529 1726882649.28334: stdout chunk (state=3): >>><<< 30529 1726882649.28337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882649.28340: handler run complete 30529 1726882649.28381: attempt loop complete, returning result 30529 1726882649.28387: _execute() done 30529 1726882649.28392: dumping result to json 30529 1726882649.28411: done dumping result, returning 30529 1726882649.28422: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-00000000146b] 30529 1726882649.28425: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146b 30529 1726882649.28692: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000146b ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882649.28766: no more pending results, returning what we have 30529 1726882649.28769: results queue empty 30529 1726882649.28770: checking for any_errors_fatal 30529 1726882649.28774: done checking for any_errors_fatal 30529 1726882649.28775: checking for max_fail_percentage 30529 1726882649.28776: done checking for max_fail_percentage 30529 1726882649.28777: checking to see if all hosts have failed and the running result is not ok 30529 1726882649.28778: done checking to see if all hosts have failed 30529 1726882649.28779: getting the remaining hosts for this loop 30529 1726882649.28780: done getting the remaining hosts for this loop 30529 1726882649.28783: getting the next task for host managed_node1 30529 1726882649.28795: done getting next task for host managed_node1 30529 1726882649.28798: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882649.28803: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882649.28820: WORKER PROCESS EXITING 30529 1726882649.28828: getting variables 30529 1726882649.28829: in VariableManager get_vars() 30529 1726882649.28862: Calling all_inventory to load vars for managed_node1 30529 1726882649.28864: Calling groups_inventory to load vars for managed_node1 30529 1726882649.28866: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882649.28875: Calling all_plugins_play to load vars for managed_node1 30529 1726882649.28878: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882649.28880: Calling groups_plugins_play to load vars for managed_node1 30529 1726882649.29865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882649.30950: done with get_vars() 30529 1726882649.30965: done getting variables 30529 1726882649.31016: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:29 -0400 (0:00:00.622) 0:01:03.336 ****** 30529 1726882649.31044: entering _queue_task() for managed_node1/service 30529 1726882649.31466: worker is 1 (out of 1 available) 30529 1726882649.31487: exiting _queue_task() for managed_node1/service 30529 1726882649.31710: done queuing things up, now waiting for results queue to drain 30529 1726882649.31712: waiting for pending results... 
30529 1726882649.31859: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882649.32043: in run() - task 12673a56-9f93-b0f1-edc0-00000000146c 30529 1726882649.32048: variable 'ansible_search_path' from source: unknown 30529 1726882649.32053: variable 'ansible_search_path' from source: unknown 30529 1726882649.32079: calling self._execute() 30529 1726882649.32241: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.32245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.32248: variable 'omit' from source: magic vars 30529 1726882649.32699: variable 'ansible_distribution_major_version' from source: facts 30529 1726882649.32703: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882649.32706: variable 'network_provider' from source: set_fact 30529 1726882649.32709: Evaluated conditional (network_provider == "nm"): True 30529 1726882649.32791: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882649.32883: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882649.33051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882649.35027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882649.35075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882649.35107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882649.35136: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882649.35157: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882649.35228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882649.35250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882649.35269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882649.35301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882649.35312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882649.35347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882649.35364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882649.35382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882649.35413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882649.35423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882649.35451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882649.35468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882649.35485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882649.35516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882649.35527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882649.35625: variable 'network_connections' from source: include params 30529 1726882649.35635: variable 'interface' from source: play vars 30529 1726882649.35679: variable 'interface' from source: play vars 30529 1726882649.35737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882649.35849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882649.35875: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882649.35900: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882649.35922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882649.35952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882649.35967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882649.35983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882649.36008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882649.36044: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882649.36198: variable 'network_connections' from source: include params 30529 1726882649.36206: variable 'interface' from source: play vars 30529 1726882649.36272: variable 'interface' from source: play vars 30529 1726882649.36499: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882649.36503: when evaluation is False, skipping this task 30529 1726882649.36506: _execute() done 30529 1726882649.36509: dumping result to json 30529 1726882649.36512: done dumping result, returning 30529 1726882649.36515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-00000000146c] 30529 
1726882649.36526: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146c 30529 1726882649.36597: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000146c 30529 1726882649.36600: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882649.36672: no more pending results, returning what we have 30529 1726882649.36676: results queue empty 30529 1726882649.36677: checking for any_errors_fatal 30529 1726882649.36704: done checking for any_errors_fatal 30529 1726882649.36705: checking for max_fail_percentage 30529 1726882649.36706: done checking for max_fail_percentage 30529 1726882649.36707: checking to see if all hosts have failed and the running result is not ok 30529 1726882649.36708: done checking to see if all hosts have failed 30529 1726882649.36709: getting the remaining hosts for this loop 30529 1726882649.36710: done getting the remaining hosts for this loop 30529 1726882649.36715: getting the next task for host managed_node1 30529 1726882649.36722: done getting next task for host managed_node1 30529 1726882649.36725: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882649.36730: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882649.36748: getting variables 30529 1726882649.36750: in VariableManager get_vars() 30529 1726882649.36784: Calling all_inventory to load vars for managed_node1 30529 1726882649.36786: Calling groups_inventory to load vars for managed_node1 30529 1726882649.36788: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882649.36884: Calling all_plugins_play to load vars for managed_node1 30529 1726882649.36888: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882649.36891: Calling groups_plugins_play to load vars for managed_node1 30529 1726882649.38226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882649.39883: done with get_vars() 30529 1726882649.39907: done getting variables 30529 1726882649.39965: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:29 -0400 (0:00:00.089) 0:01:03.426 
****** 30529 1726882649.40001: entering _queue_task() for managed_node1/service 30529 1726882649.40310: worker is 1 (out of 1 available) 30529 1726882649.40322: exiting _queue_task() for managed_node1/service 30529 1726882649.40334: done queuing things up, now waiting for results queue to drain 30529 1726882649.40336: waiting for pending results... 30529 1726882649.40713: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882649.40772: in run() - task 12673a56-9f93-b0f1-edc0-00000000146d 30529 1726882649.40798: variable 'ansible_search_path' from source: unknown 30529 1726882649.40805: variable 'ansible_search_path' from source: unknown 30529 1726882649.40839: calling self._execute() 30529 1726882649.40978: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.40989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.41015: variable 'omit' from source: magic vars 30529 1726882649.41483: variable 'ansible_distribution_major_version' from source: facts 30529 1726882649.41497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882649.41601: variable 'network_provider' from source: set_fact 30529 1726882649.41605: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882649.41608: when evaluation is False, skipping this task 30529 1726882649.41611: _execute() done 30529 1726882649.41613: dumping result to json 30529 1726882649.41618: done dumping result, returning 30529 1726882649.41626: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-00000000146d] 30529 1726882649.41630: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146d skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30529 1726882649.41755: no more pending results, returning what we have 30529 1726882649.41759: results queue empty 30529 1726882649.41760: checking for any_errors_fatal 30529 1726882649.41767: done checking for any_errors_fatal 30529 1726882649.41768: checking for max_fail_percentage 30529 1726882649.41770: done checking for max_fail_percentage 30529 1726882649.41771: checking to see if all hosts have failed and the running result is not ok 30529 1726882649.41772: done checking to see if all hosts have failed 30529 1726882649.41772: getting the remaining hosts for this loop 30529 1726882649.41774: done getting the remaining hosts for this loop 30529 1726882649.41777: getting the next task for host managed_node1 30529 1726882649.41786: done getting next task for host managed_node1 30529 1726882649.41790: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882649.41797: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882649.41821: getting variables 30529 1726882649.41823: in VariableManager get_vars() 30529 1726882649.41859: Calling all_inventory to load vars for managed_node1 30529 1726882649.41861: Calling groups_inventory to load vars for managed_node1 30529 1726882649.41864: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882649.41874: Calling all_plugins_play to load vars for managed_node1 30529 1726882649.41876: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882649.41879: Calling groups_plugins_play to load vars for managed_node1 30529 1726882649.42751: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000146d 30529 1726882649.42755: WORKER PROCESS EXITING 30529 1726882649.42765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882649.43960: done with get_vars() 30529 1726882649.43979: done getting variables 30529 1726882649.44038: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:29 -0400 (0:00:00.040) 0:01:03.466 ****** 30529 1726882649.44073: entering _queue_task() for managed_node1/copy 30529 1726882649.44328: worker is 1 (out of 1 available) 30529 1726882649.44342: exiting _queue_task() for managed_node1/copy 30529 1726882649.44354: done queuing things up, now waiting for results queue to drain 30529 1726882649.44355: waiting for pending 
results... 30529 1726882649.44544: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882649.44634: in run() - task 12673a56-9f93-b0f1-edc0-00000000146e 30529 1726882649.44645: variable 'ansible_search_path' from source: unknown 30529 1726882649.44649: variable 'ansible_search_path' from source: unknown 30529 1726882649.44674: calling self._execute() 30529 1726882649.44751: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.44754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.44763: variable 'omit' from source: magic vars 30529 1726882649.45037: variable 'ansible_distribution_major_version' from source: facts 30529 1726882649.45047: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882649.45127: variable 'network_provider' from source: set_fact 30529 1726882649.45131: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882649.45134: when evaluation is False, skipping this task 30529 1726882649.45136: _execute() done 30529 1726882649.45139: dumping result to json 30529 1726882649.45141: done dumping result, returning 30529 1726882649.45152: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-00000000146e] 30529 1726882649.45154: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146e 30529 1726882649.45240: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000146e 30529 1726882649.45243: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882649.45287: no more pending results, returning what we have 30529 1726882649.45290: results queue empty 30529 1726882649.45291: 
checking for any_errors_fatal 30529 1726882649.45296: done checking for any_errors_fatal 30529 1726882649.45297: checking for max_fail_percentage 30529 1726882649.45299: done checking for max_fail_percentage 30529 1726882649.45300: checking to see if all hosts have failed and the running result is not ok 30529 1726882649.45300: done checking to see if all hosts have failed 30529 1726882649.45301: getting the remaining hosts for this loop 30529 1726882649.45303: done getting the remaining hosts for this loop 30529 1726882649.45306: getting the next task for host managed_node1 30529 1726882649.45313: done getting next task for host managed_node1 30529 1726882649.45316: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882649.45320: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882649.45338: getting variables 30529 1726882649.45339: in VariableManager get_vars() 30529 1726882649.45370: Calling all_inventory to load vars for managed_node1 30529 1726882649.45372: Calling groups_inventory to load vars for managed_node1 30529 1726882649.45374: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882649.45382: Calling all_plugins_play to load vars for managed_node1 30529 1726882649.45384: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882649.45387: Calling groups_plugins_play to load vars for managed_node1 30529 1726882649.46123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882649.47739: done with get_vars() 30529 1726882649.47761: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:29 -0400 (0:00:00.037) 0:01:03.504 ****** 30529 1726882649.47844: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882649.48096: worker is 1 (out of 1 available) 30529 1726882649.48108: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882649.48120: done queuing things up, now waiting for results queue to drain 30529 1726882649.48121: waiting for pending results... 
30529 1726882649.48511: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882649.48545: in run() - task 12673a56-9f93-b0f1-edc0-00000000146f 30529 1726882649.48566: variable 'ansible_search_path' from source: unknown 30529 1726882649.48573: variable 'ansible_search_path' from source: unknown 30529 1726882649.48616: calling self._execute() 30529 1726882649.48709: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.48724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.48738: variable 'omit' from source: magic vars 30529 1726882649.49113: variable 'ansible_distribution_major_version' from source: facts 30529 1726882649.49131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882649.49146: variable 'omit' from source: magic vars 30529 1726882649.49219: variable 'omit' from source: magic vars 30529 1726882649.49379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882649.51450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882649.51518: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882649.51566: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882649.51609: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882649.51646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882649.51731: variable 'network_provider' from source: set_fact 30529 1726882649.51871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882649.51905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882649.51936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882649.51985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882649.52079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882649.52088: variable 'omit' from source: magic vars 30529 1726882649.52198: variable 'omit' from source: magic vars 30529 1726882649.52303: variable 'network_connections' from source: include params 30529 1726882649.52320: variable 'interface' from source: play vars 30529 1726882649.52381: variable 'interface' from source: play vars 30529 1726882649.52538: variable 'omit' from source: magic vars 30529 1726882649.52552: variable '__lsr_ansible_managed' from source: task vars 30529 1726882649.52615: variable '__lsr_ansible_managed' from source: task vars 30529 1726882649.52805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882649.53021: Loaded config def from plugin (lookup/template) 30529 1726882649.53032: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882649.53068: File lookup term: get_ansible_managed.j2 30529 1726882649.53168: variable 
'ansible_search_path' from source: unknown 30529 1726882649.53173: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882649.53178: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882649.53181: variable 'ansible_search_path' from source: unknown 30529 1726882649.59230: variable 'ansible_managed' from source: unknown 30529 1726882649.59364: variable 'omit' from source: magic vars 30529 1726882649.59398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882649.59431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882649.59456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882649.59478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882649.59492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882649.59524: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882649.59536: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.59545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.59649: Set connection var ansible_shell_executable to /bin/sh 30529 1726882649.59660: Set connection var ansible_pipelining to False 30529 1726882649.59667: Set connection var ansible_shell_type to sh 30529 1726882649.59680: Set connection var ansible_timeout to 10 30529 1726882649.59897: Set connection var ansible_connection to ssh 30529 1726882649.59900: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882649.59902: variable 'ansible_shell_executable' from source: unknown 30529 1726882649.59904: variable 'ansible_connection' from source: unknown 30529 1726882649.59906: variable 'ansible_module_compression' from source: unknown 30529 1726882649.59908: variable 'ansible_shell_type' from source: unknown 30529 1726882649.59909: variable 'ansible_shell_executable' from source: unknown 30529 1726882649.59912: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882649.59913: variable 'ansible_pipelining' from source: unknown 30529 1726882649.59915: variable 'ansible_timeout' from source: unknown 30529 1726882649.59917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882649.59920: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882649.59929: variable 'omit' from 
source: magic vars 30529 1726882649.59931: starting attempt loop 30529 1726882649.59934: running the handler 30529 1726882649.59939: _low_level_execute_command(): starting 30529 1726882649.59952: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882649.60463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882649.60479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882649.60490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.60536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882649.60557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.60605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.62232: stdout chunk (state=3): >>>/root <<< 30529 1726882649.62331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882649.62371: stderr chunk (state=3): >>><<< 30529 1726882649.62374: stdout chunk (state=3): >>><<< 30529 
1726882649.62391: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882649.62405: _low_level_execute_command(): starting 30529 1726882649.62413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360 `" && echo ansible-tmp-1726882649.6239529-33554-258632898172360="` echo /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360 `" ) && sleep 0' 30529 1726882649.63100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882649.63122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.63189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.65050: stdout chunk (state=3): >>>ansible-tmp-1726882649.6239529-33554-258632898172360=/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360 <<< 30529 1726882649.65212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882649.65217: stdout chunk (state=3): >>><<< 30529 1726882649.65219: stderr chunk (state=3): >>><<< 30529 1726882649.65399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882649.6239529-33554-258632898172360=/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882649.65402: variable 'ansible_module_compression' from source: unknown 30529 1726882649.65405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882649.65407: variable 'ansible_facts' from source: unknown 30529 1726882649.65539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py 30529 1726882649.65753: Sending initial data 30529 1726882649.65756: Sent initial data (168 bytes) 30529 1726882649.66186: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882649.66201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882649.66220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.66256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882649.66268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.66314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.67836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882649.67844: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882649.67875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882649.67919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpa1bj30cg /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py <<< 30529 1726882649.67921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py" <<< 30529 1726882649.67956: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpa1bj30cg" to remote "/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py" <<< 30529 1726882649.67962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py" <<< 30529 1726882649.68926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882649.68982: stderr chunk (state=3): >>><<< 30529 1726882649.68985: stdout chunk (state=3): >>><<< 30529 1726882649.69036: done transferring module to remote 30529 1726882649.69039: _low_level_execute_command(): starting 30529 1726882649.69042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/ /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py && sleep 0' 30529 1726882649.69579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.69646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882649.69650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882649.69700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.69776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.71520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882649.71567: stderr chunk (state=3): >>><<< 30529 1726882649.71570: stdout chunk (state=3): >>><<< 30529 1726882649.71592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882649.71597: _low_level_execute_command(): starting 30529 1726882649.71600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/AnsiballZ_network_connections.py && sleep 0' 30529 1726882649.72181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882649.72184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.72187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882649.72189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882649.72249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882649.72252: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882649.72311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882649.98603: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3o7en4lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3o7en4lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/925d78f3-a59a-474c-aff9-927d62a7a239: error=unknown <<< 30529 1726882649.98765: stdout chunk (state=3): >>> <<< 30529 1726882649.98770: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882650.00607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882650.00611: stdout chunk (state=3): >>><<< 30529 1726882650.00616: stderr chunk (state=3): >>><<< 30529 1726882650.00788: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3o7en4lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3o7en4lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/925d78f3-a59a-474c-aff9-927d62a7a239: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882650.00792: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882650.00797: _low_level_execute_command(): starting 30529 1726882650.00799: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882649.6239529-33554-258632898172360/ > /dev/null 2>&1 && sleep 0' 30529 1726882650.01569: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.01573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.01575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.01577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.01634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.01654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.01676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.01745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.03601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.03652: stderr chunk (state=3): >>><<< 30529 1726882650.03662: stdout chunk (state=3): >>><<< 30529 1726882650.03682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.03696: handler run complete 30529 1726882650.03729: attempt loop complete, returning result 30529 1726882650.03737: _execute() done 30529 1726882650.03743: dumping result to json 30529 1726882650.03752: done dumping result, returning 30529 1726882650.03765: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-00000000146f] 30529 1726882650.03777: sending task result for task 12673a56-9f93-b0f1-edc0-00000000146f changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30529 1726882650.04103: no more pending results, returning what we have 30529 1726882650.04108: results queue empty 30529 1726882650.04109: 
checking for any_errors_fatal 30529 1726882650.04116: done checking for any_errors_fatal 30529 1726882650.04117: checking for max_fail_percentage 30529 1726882650.04119: done checking for max_fail_percentage 30529 1726882650.04120: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.04121: done checking to see if all hosts have failed 30529 1726882650.04122: getting the remaining hosts for this loop 30529 1726882650.04124: done getting the remaining hosts for this loop 30529 1726882650.04127: getting the next task for host managed_node1 30529 1726882650.04137: done getting next task for host managed_node1 30529 1726882650.04140: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882650.04145: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.04158: getting variables 30529 1726882650.04160: in VariableManager get_vars() 30529 1726882650.04404: Calling all_inventory to load vars for managed_node1 30529 1726882650.04407: Calling groups_inventory to load vars for managed_node1 30529 1726882650.04410: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.04417: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000146f 30529 1726882650.04421: WORKER PROCESS EXITING 30529 1726882650.04430: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.04433: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.04436: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.05883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.07388: done with get_vars() 30529 1726882650.07415: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:30 -0400 (0:00:00.596) 0:01:04.101 ****** 30529 1726882650.07504: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882650.07841: worker is 1 (out of 1 available) 30529 1726882650.07852: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882650.07866: done queuing things up, now waiting for results queue to drain 30529 1726882650.07868: waiting for pending results... 
30529 1726882650.08154: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882650.08308: in run() - task 12673a56-9f93-b0f1-edc0-000000001470 30529 1726882650.08332: variable 'ansible_search_path' from source: unknown 30529 1726882650.08339: variable 'ansible_search_path' from source: unknown 30529 1726882650.08379: calling self._execute() 30529 1726882650.08487: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.08501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.08516: variable 'omit' from source: magic vars 30529 1726882650.08895: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.08913: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.09037: variable 'network_state' from source: role '' defaults 30529 1726882650.09054: Evaluated conditional (network_state != {}): False 30529 1726882650.09062: when evaluation is False, skipping this task 30529 1726882650.09073: _execute() done 30529 1726882650.09085: dumping result to json 30529 1726882650.09094: done dumping result, returning 30529 1726882650.09300: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000001470] 30529 1726882650.09304: sending task result for task 12673a56-9f93-b0f1-edc0-000000001470 30529 1726882650.09371: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001470 30529 1726882650.09375: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882650.09428: no more pending results, returning what we have 30529 1726882650.09431: results queue empty 30529 1726882650.09432: checking for any_errors_fatal 30529 1726882650.09444: done checking for any_errors_fatal 
30529 1726882650.09445: checking for max_fail_percentage 30529 1726882650.09446: done checking for max_fail_percentage 30529 1726882650.09447: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.09448: done checking to see if all hosts have failed 30529 1726882650.09449: getting the remaining hosts for this loop 30529 1726882650.09451: done getting the remaining hosts for this loop 30529 1726882650.09454: getting the next task for host managed_node1 30529 1726882650.09462: done getting next task for host managed_node1 30529 1726882650.09466: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882650.09471: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.09498: getting variables 30529 1726882650.09500: in VariableManager get_vars() 30529 1726882650.09541: Calling all_inventory to load vars for managed_node1 30529 1726882650.09544: Calling groups_inventory to load vars for managed_node1 30529 1726882650.09546: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.09558: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.09561: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.09564: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.11206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.12746: done with get_vars() 30529 1726882650.12769: done getting variables 30529 1726882650.12842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:30 -0400 (0:00:00.053) 0:01:04.154 ****** 30529 1726882650.12878: entering _queue_task() for managed_node1/debug 30529 1726882650.13189: worker is 1 (out of 1 available) 30529 1726882650.13206: exiting _queue_task() for managed_node1/debug 30529 1726882650.13219: done queuing things up, now waiting for results queue to drain 30529 1726882650.13220: waiting for pending results... 
30529 1726882650.13404: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882650.13499: in run() - task 12673a56-9f93-b0f1-edc0-000000001471 30529 1726882650.13509: variable 'ansible_search_path' from source: unknown 30529 1726882650.13513: variable 'ansible_search_path' from source: unknown 30529 1726882650.13540: calling self._execute() 30529 1726882650.13612: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.13616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.13626: variable 'omit' from source: magic vars 30529 1726882650.13897: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.13912: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.13918: variable 'omit' from source: magic vars 30529 1726882650.13960: variable 'omit' from source: magic vars 30529 1726882650.13984: variable 'omit' from source: magic vars 30529 1726882650.14024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882650.14050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882650.14066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882650.14080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.14091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.14121: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882650.14124: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.14126: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882650.14199: Set connection var ansible_shell_executable to /bin/sh 30529 1726882650.14206: Set connection var ansible_pipelining to False 30529 1726882650.14209: Set connection var ansible_shell_type to sh 30529 1726882650.14216: Set connection var ansible_timeout to 10 30529 1726882650.14220: Set connection var ansible_connection to ssh 30529 1726882650.14225: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882650.14241: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.14244: variable 'ansible_connection' from source: unknown 30529 1726882650.14246: variable 'ansible_module_compression' from source: unknown 30529 1726882650.14249: variable 'ansible_shell_type' from source: unknown 30529 1726882650.14251: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.14253: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.14257: variable 'ansible_pipelining' from source: unknown 30529 1726882650.14259: variable 'ansible_timeout' from source: unknown 30529 1726882650.14263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.14364: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882650.14373: variable 'omit' from source: magic vars 30529 1726882650.14378: starting attempt loop 30529 1726882650.14381: running the handler 30529 1726882650.14479: variable '__network_connections_result' from source: set_fact 30529 1726882650.14527: handler run complete 30529 1726882650.14551: attempt loop complete, returning result 30529 1726882650.14554: _execute() done 30529 1726882650.14557: dumping result to json 30529 1726882650.14559: 
done dumping result, returning 30529 1726882650.14561: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001471] 30529 1726882650.14563: sending task result for task 12673a56-9f93-b0f1-edc0-000000001471 30529 1726882650.14639: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001471 30529 1726882650.14642: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 30529 1726882650.14714: no more pending results, returning what we have 30529 1726882650.14718: results queue empty 30529 1726882650.14719: checking for any_errors_fatal 30529 1726882650.14728: done checking for any_errors_fatal 30529 1726882650.14728: checking for max_fail_percentage 30529 1726882650.14730: done checking for max_fail_percentage 30529 1726882650.14733: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.14734: done checking to see if all hosts have failed 30529 1726882650.14735: getting the remaining hosts for this loop 30529 1726882650.14737: done getting the remaining hosts for this loop 30529 1726882650.14740: getting the next task for host managed_node1 30529 1726882650.14748: done getting next task for host managed_node1 30529 1726882650.14752: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882650.14756: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882650.14768: getting variables 30529 1726882650.14769: in VariableManager get_vars() 30529 1726882650.14806: Calling all_inventory to load vars for managed_node1 30529 1726882650.14808: Calling groups_inventory to load vars for managed_node1 30529 1726882650.14810: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.14820: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.14823: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.14825: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.16097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.16972: done with get_vars() 30529 1726882650.16986: done getting variables 30529 1726882650.17028: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:30 -0400 (0:00:00.041) 0:01:04.196 ****** 30529 1726882650.17054: entering _queue_task() for managed_node1/debug 30529 1726882650.17261: worker is 1 (out of 1 available) 30529 1726882650.17273: exiting _queue_task() for managed_node1/debug 30529 1726882650.17285: done queuing things up, now waiting for results queue to drain 30529 1726882650.17287: waiting for pending results... 30529 1726882650.17459: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882650.17555: in run() - task 12673a56-9f93-b0f1-edc0-000000001472 30529 1726882650.17567: variable 'ansible_search_path' from source: unknown 30529 1726882650.17571: variable 'ansible_search_path' from source: unknown 30529 1726882650.17600: calling self._execute() 30529 1726882650.17671: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.17675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.17683: variable 'omit' from source: magic vars 30529 1726882650.17986: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.18045: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.18048: variable 'omit' from source: magic vars 30529 1726882650.18072: variable 'omit' from source: magic vars 30529 1726882650.18298: variable 'omit' from source: magic vars 30529 1726882650.18302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882650.18305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882650.18307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882650.18309: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.18311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.18313: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882650.18315: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.18317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.18352: Set connection var ansible_shell_executable to /bin/sh 30529 1726882650.18357: Set connection var ansible_pipelining to False 30529 1726882650.18360: Set connection var ansible_shell_type to sh 30529 1726882650.18369: Set connection var ansible_timeout to 10 30529 1726882650.18372: Set connection var ansible_connection to ssh 30529 1726882650.18377: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882650.18398: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.18401: variable 'ansible_connection' from source: unknown 30529 1726882650.18404: variable 'ansible_module_compression' from source: unknown 30529 1726882650.18406: variable 'ansible_shell_type' from source: unknown 30529 1726882650.18409: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.18411: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.18415: variable 'ansible_pipelining' from source: unknown 30529 1726882650.18418: variable 'ansible_timeout' from source: unknown 30529 1726882650.18422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.18600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882650.18604: variable 'omit' from source: magic vars 30529 1726882650.18606: starting attempt loop 30529 1726882650.18609: running the handler 30529 1726882650.18619: variable '__network_connections_result' from source: set_fact 30529 1726882650.18697: variable '__network_connections_result' from source: set_fact 30529 1726882650.18807: handler run complete 30529 1726882650.18814: attempt loop complete, returning result 30529 1726882650.18818: _execute() done 30529 1726882650.18821: dumping result to json 30529 1726882650.18823: done dumping result, returning 30529 1726882650.18833: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001472] 30529 1726882650.18836: sending task result for task 12673a56-9f93-b0f1-edc0-000000001472 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30529 1726882650.19045: no more pending results, returning what we have 30529 1726882650.19049: results queue empty 30529 1726882650.19050: checking for any_errors_fatal 30529 1726882650.19056: done checking for any_errors_fatal 30529 1726882650.19057: checking for max_fail_percentage 30529 1726882650.19058: done checking for max_fail_percentage 30529 1726882650.19059: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.19060: done checking to see if all hosts have failed 30529 1726882650.19061: getting the 
remaining hosts for this loop 30529 1726882650.19063: done getting the remaining hosts for this loop 30529 1726882650.19066: getting the next task for host managed_node1 30529 1726882650.19074: done getting next task for host managed_node1 30529 1726882650.19078: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882650.19083: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.19101: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001472 30529 1726882650.19111: WORKER PROCESS EXITING 30529 1726882650.19119: getting variables 30529 1726882650.19121: in VariableManager get_vars() 30529 1726882650.19161: Calling all_inventory to load vars for managed_node1 30529 1726882650.19163: Calling groups_inventory to load vars for managed_node1 30529 1726882650.19166: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.19176: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.19179: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.19182: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.20400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.21250: done with get_vars() 30529 1726882650.21264: done getting variables 30529 1726882650.21309: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:30 -0400 (0:00:00.042) 0:01:04.239 ****** 30529 1726882650.21331: entering _queue_task() for managed_node1/debug 30529 1726882650.21553: worker is 1 (out of 1 available) 30529 1726882650.21570: exiting _queue_task() for managed_node1/debug 30529 1726882650.21583: done queuing things up, now waiting for results queue to drain 30529 1726882650.21585: waiting for pending results... 
30529 1726882650.21836: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882650.22003: in run() - task 12673a56-9f93-b0f1-edc0-000000001473 30529 1726882650.22007: variable 'ansible_search_path' from source: unknown 30529 1726882650.22012: variable 'ansible_search_path' from source: unknown 30529 1726882650.22015: calling self._execute() 30529 1726882650.22101: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.22113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.22116: variable 'omit' from source: magic vars 30529 1726882650.22476: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.22550: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.22605: variable 'network_state' from source: role '' defaults 30529 1726882650.22614: Evaluated conditional (network_state != {}): False 30529 1726882650.22617: when evaluation is False, skipping this task 30529 1726882650.22620: _execute() done 30529 1726882650.22657: dumping result to json 30529 1726882650.22660: done dumping result, returning 30529 1726882650.22663: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000001473] 30529 1726882650.22665: sending task result for task 12673a56-9f93-b0f1-edc0-000000001473 30529 1726882650.22728: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001473 30529 1726882650.22731: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882650.22801: no more pending results, returning what we have 30529 1726882650.22805: results queue empty 30529 1726882650.22806: checking for any_errors_fatal 30529 1726882650.22812: done checking for any_errors_fatal 30529 1726882650.22813: checking for 
max_fail_percentage 30529 1726882650.22814: done checking for max_fail_percentage 30529 1726882650.22815: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.22816: done checking to see if all hosts have failed 30529 1726882650.22817: getting the remaining hosts for this loop 30529 1726882650.22818: done getting the remaining hosts for this loop 30529 1726882650.22821: getting the next task for host managed_node1 30529 1726882650.22828: done getting next task for host managed_node1 30529 1726882650.22831: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882650.22835: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.22855: getting variables 30529 1726882650.22856: in VariableManager get_vars() 30529 1726882650.22887: Calling all_inventory to load vars for managed_node1 30529 1726882650.22889: Calling groups_inventory to load vars for managed_node1 30529 1726882650.22891: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.22901: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.22904: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.22906: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.23982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.25146: done with get_vars() 30529 1726882650.25161: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:30 -0400 (0:00:00.038) 0:01:04.278 ****** 30529 1726882650.25231: entering _queue_task() for managed_node1/ping 30529 1726882650.25428: worker is 1 (out of 1 available) 30529 1726882650.25442: exiting _queue_task() for managed_node1/ping 30529 1726882650.25455: done queuing things up, now waiting for results queue to drain 30529 1726882650.25456: waiting for pending results... 
30529 1726882650.25627: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882650.25719: in run() - task 12673a56-9f93-b0f1-edc0-000000001474 30529 1726882650.25731: variable 'ansible_search_path' from source: unknown 30529 1726882650.25734: variable 'ansible_search_path' from source: unknown 30529 1726882650.25760: calling self._execute() 30529 1726882650.25833: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.25836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.25845: variable 'omit' from source: magic vars 30529 1726882650.26113: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.26125: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.26129: variable 'omit' from source: magic vars 30529 1726882650.26168: variable 'omit' from source: magic vars 30529 1726882650.26189: variable 'omit' from source: magic vars 30529 1726882650.26221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882650.26250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882650.26265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882650.26278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.26288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.26315: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882650.26318: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.26320: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882650.26430: Set connection var ansible_shell_executable to /bin/sh 30529 1726882650.26433: Set connection var ansible_pipelining to False 30529 1726882650.26436: Set connection var ansible_shell_type to sh 30529 1726882650.26438: Set connection var ansible_timeout to 10 30529 1726882650.26440: Set connection var ansible_connection to ssh 30529 1726882650.26444: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882650.26466: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.26469: variable 'ansible_connection' from source: unknown 30529 1726882650.26472: variable 'ansible_module_compression' from source: unknown 30529 1726882650.26474: variable 'ansible_shell_type' from source: unknown 30529 1726882650.26476: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.26478: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.26480: variable 'ansible_pipelining' from source: unknown 30529 1726882650.26482: variable 'ansible_timeout' from source: unknown 30529 1726882650.26484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.26799: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882650.26804: variable 'omit' from source: magic vars 30529 1726882650.26806: starting attempt loop 30529 1726882650.26808: running the handler 30529 1726882650.26811: _low_level_execute_command(): starting 30529 1726882650.26812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882650.27377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882650.27381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 
1726882650.27390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.27409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.27423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882650.27430: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882650.27450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.27453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882650.27473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882650.27476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882650.27514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.27516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.27518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.27531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.27582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.27610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.27666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.29347: stdout chunk (state=3): >>>/root <<< 30529 1726882650.29447: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 30529 1726882650.29519: stderr chunk (state=3): >>><<< 30529 1726882650.29522: stdout chunk (state=3): >>><<< 30529 1726882650.29526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.29528: _low_level_execute_command(): starting 30529 1726882650.29531: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211 `" && echo ansible-tmp-1726882650.2948482-33594-160941107558211="` echo /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211 `" ) && sleep 0' 30529 1726882650.29912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882650.29922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.29925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.29928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.29961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.29965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.30018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.31858: stdout chunk (state=3): >>>ansible-tmp-1726882650.2948482-33594-160941107558211=/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211 <<< 30529 1726882650.31971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.31996: stderr chunk (state=3): >>><<< 30529 1726882650.31999: stdout chunk (state=3): >>><<< 30529 1726882650.32011: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882650.2948482-33594-160941107558211=/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.32044: variable 'ansible_module_compression' from source: unknown 30529 1726882650.32076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882650.32106: variable 'ansible_facts' from source: unknown 30529 1726882650.32157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py 30529 1726882650.32248: Sending initial data 30529 1726882650.32251: Sent initial data (153 bytes) 30529 1726882650.32665: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.32668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882650.32670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.32672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.32674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.32725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.32730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.32771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.34286: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882650.34295: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882650.34325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882650.34366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgncj3569 /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py <<< 30529 1726882650.34373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py" <<< 30529 1726882650.34409: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpgncj3569" to remote "/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py" <<< 30529 1726882650.34916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.34945: stderr chunk (state=3): >>><<< 30529 1726882650.34948: stdout chunk (state=3): >>><<< 30529 1726882650.34985: done transferring module to remote 30529 1726882650.34999: _low_level_execute_command(): starting 30529 1726882650.35002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/ /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py && sleep 0' 30529 1726882650.35379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.35410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.35413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882650.35415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.35421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.35467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.35470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.35518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.37220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.37240: stderr chunk (state=3): >>><<< 30529 1726882650.37243: stdout chunk (state=3): >>><<< 30529 1726882650.37260: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.37263: _low_level_execute_command(): starting 30529 1726882650.37266: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/AnsiballZ_ping.py && sleep 0' 30529 1726882650.37666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.37669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.37671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882650.37674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.37676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.37725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.37729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.37778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.52698: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882650.53939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882650.53966: stderr chunk (state=3): >>><<< 30529 1726882650.53970: stdout chunk (state=3): >>><<< 30529 1726882650.53986: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882650.54013: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882650.54027: _low_level_execute_command(): starting 30529 1726882650.54031: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882650.2948482-33594-160941107558211/ > /dev/null 2>&1 && sleep 0' 30529 1726882650.54476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.54499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882650.54503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882650.54505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.54517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.54577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.54580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.54588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.54626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.56419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.56445: stderr chunk (state=3): >>><<< 30529 1726882650.56448: stdout chunk (state=3): >>><<< 30529 1726882650.56463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.56472: handler run complete 30529 1726882650.56484: attempt loop complete, returning result 30529 1726882650.56487: _execute() done 30529 1726882650.56491: dumping result to json 30529 1726882650.56496: done dumping result, returning 30529 1726882650.56503: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000001474] 30529 1726882650.56508: sending task result for task 12673a56-9f93-b0f1-edc0-000000001474 30529 1726882650.56601: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001474 30529 1726882650.56603: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882650.56675: no more pending results, returning what we have 30529 1726882650.56679: results queue empty 30529 1726882650.56680: checking for any_errors_fatal 30529 1726882650.56686: done checking for any_errors_fatal 30529 1726882650.56687: checking for max_fail_percentage 30529 1726882650.56691: done checking for max_fail_percentage 30529 1726882650.56692: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.56695: done checking to see if all hosts have failed 30529 1726882650.56696: getting the remaining hosts for this loop 30529 1726882650.56697: done getting the remaining hosts for this loop 30529 1726882650.56701: getting the next task for host managed_node1 30529 1726882650.56713: done getting next task for host managed_node1 30529 1726882650.56715: ^ task is: TASK: meta (role_complete) 30529 1726882650.56720: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882650.56732: getting variables 30529 1726882650.56733: in VariableManager get_vars() 30529 1726882650.56773: Calling all_inventory to load vars for managed_node1 30529 1726882650.56775: Calling groups_inventory to load vars for managed_node1 30529 1726882650.56777: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.56786: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.56789: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.56801: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.57712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.58572: done with get_vars() 30529 1726882650.58587: done getting variables 30529 1726882650.58648: done queuing things up, now waiting for results queue to drain 30529 1726882650.58649: results queue empty 30529 1726882650.58650: checking for any_errors_fatal 30529 1726882650.58651: done checking for 
any_errors_fatal 30529 1726882650.58652: checking for max_fail_percentage 30529 1726882650.58653: done checking for max_fail_percentage 30529 1726882650.58653: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.58653: done checking to see if all hosts have failed 30529 1726882650.58654: getting the remaining hosts for this loop 30529 1726882650.58654: done getting the remaining hosts for this loop 30529 1726882650.58656: getting the next task for host managed_node1 30529 1726882650.58659: done getting next task for host managed_node1 30529 1726882650.58661: ^ task is: TASK: Asserts 30529 1726882650.58662: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.58664: getting variables 30529 1726882650.58665: in VariableManager get_vars() 30529 1726882650.58672: Calling all_inventory to load vars for managed_node1 30529 1726882650.58673: Calling groups_inventory to load vars for managed_node1 30529 1726882650.58674: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.58678: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.58679: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.58680: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.59316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.60246: done with get_vars() 30529 1726882650.60260: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:37:30 -0400 (0:00:00.350) 0:01:04.629 ****** 30529 1726882650.60318: entering _queue_task() for managed_node1/include_tasks 30529 1726882650.60582: worker is 1 (out of 1 available) 30529 1726882650.60598: exiting _queue_task() for managed_node1/include_tasks 30529 1726882650.60612: done queuing things up, now waiting for results queue to drain 30529 1726882650.60614: waiting for pending results... 
30529 1726882650.60788: running TaskExecutor() for managed_node1/TASK: Asserts 30529 1726882650.60867: in run() - task 12673a56-9f93-b0f1-edc0-00000000100a 30529 1726882650.60878: variable 'ansible_search_path' from source: unknown 30529 1726882650.60881: variable 'ansible_search_path' from source: unknown 30529 1726882650.60919: variable 'lsr_assert' from source: include params 30529 1726882650.61081: variable 'lsr_assert' from source: include params 30529 1726882650.61138: variable 'omit' from source: magic vars 30529 1726882650.61237: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.61245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.61253: variable 'omit' from source: magic vars 30529 1726882650.61420: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.61428: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.61434: variable 'item' from source: unknown 30529 1726882650.61478: variable 'item' from source: unknown 30529 1726882650.61507: variable 'item' from source: unknown 30529 1726882650.61548: variable 'item' from source: unknown 30529 1726882650.61680: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.61684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.61686: variable 'omit' from source: magic vars 30529 1726882650.61757: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.61760: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.61765: variable 'item' from source: unknown 30529 1726882650.61812: variable 'item' from source: unknown 30529 1726882650.61832: variable 'item' from source: unknown 30529 1726882650.61873: variable 'item' from source: unknown 30529 1726882650.61940: dumping result to json 30529 1726882650.61942: done dumping result, returning 30529 
1726882650.61945: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-00000000100a] 30529 1726882650.61947: sending task result for task 12673a56-9f93-b0f1-edc0-00000000100a 30529 1726882650.61978: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000100a 30529 1726882650.61980: WORKER PROCESS EXITING 30529 1726882650.62012: no more pending results, returning what we have 30529 1726882650.62017: in VariableManager get_vars() 30529 1726882650.62056: Calling all_inventory to load vars for managed_node1 30529 1726882650.62058: Calling groups_inventory to load vars for managed_node1 30529 1726882650.62061: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.62076: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.62079: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.62082: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.62874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.63727: done with get_vars() 30529 1726882650.63742: variable 'ansible_search_path' from source: unknown 30529 1726882650.63743: variable 'ansible_search_path' from source: unknown 30529 1726882650.63771: variable 'ansible_search_path' from source: unknown 30529 1726882650.63771: variable 'ansible_search_path' from source: unknown 30529 1726882650.63787: we have included files to process 30529 1726882650.63788: generating all_blocks data 30529 1726882650.63792: done generating all_blocks data 30529 1726882650.63798: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882650.63799: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882650.63801: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30529 1726882650.63874: in VariableManager get_vars() 30529 1726882650.63888: done with get_vars() 30529 1726882650.63975: done processing included file 30529 1726882650.63977: iterating over new_blocks loaded from include file 30529 1726882650.63978: in VariableManager get_vars() 30529 1726882650.63988: done with get_vars() 30529 1726882650.63989: filtering new block on tags 30529 1726882650.64013: done filtering new block on tags 30529 1726882650.64015: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 => (item=tasks/assert_device_present.yml) 30529 1726882650.64020: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882650.64020: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882650.64022: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882650.64113: in VariableManager get_vars() 30529 1726882650.64126: done with get_vars() 30529 1726882650.64185: done processing included file 30529 1726882650.64186: iterating over new_blocks loaded from include file 30529 1726882650.64187: in VariableManager get_vars() 30529 1726882650.64198: done with get_vars() 30529 1726882650.64199: filtering new block on tags 30529 1726882650.64218: done filtering new block on tags 30529 1726882650.64220: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node1 => (item=tasks/assert_profile_absent.yml) 30529 1726882650.64222: extending task lists for all hosts with included blocks 30529 1726882650.64792: done extending task lists 30529 1726882650.64795: done processing included files 30529 1726882650.64795: results queue empty 30529 1726882650.64796: checking for any_errors_fatal 30529 1726882650.64797: done checking for any_errors_fatal 30529 1726882650.64797: checking for max_fail_percentage 30529 1726882650.64798: done checking for max_fail_percentage 30529 1726882650.64799: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.64799: done checking to see if all hosts have failed 30529 1726882650.64800: getting the remaining hosts for this loop 30529 1726882650.64801: done getting the remaining hosts for this loop 30529 1726882650.64802: getting the next task for host managed_node1 30529 1726882650.64805: done getting next task for host managed_node1 30529 1726882650.64807: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882650.64809: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.64819: getting variables 30529 1726882650.64820: in VariableManager get_vars() 30529 1726882650.64827: Calling all_inventory to load vars for managed_node1 30529 1726882650.64828: Calling groups_inventory to load vars for managed_node1 30529 1726882650.64830: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.64834: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.64835: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.64837: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.65506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.66334: done with get_vars() 30529 1726882650.66348: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:37:30 -0400 (0:00:00.060) 0:01:04.690 ****** 30529 1726882650.66400: entering _queue_task() for managed_node1/include_tasks 30529 1726882650.66662: worker is 1 (out of 1 available) 30529 1726882650.66676: exiting _queue_task() for managed_node1/include_tasks 30529 1726882650.66688: done queuing things up, now waiting for results queue to drain 30529 1726882650.66690: waiting for pending results... 
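The `Friday 20 September 2024 21:37:30 -0400 (0:00:00.060) 0:01:04.690` suffix on the task banner above reports the previous task's duration followed by the cumulative playbook runtime, in the style of a profiling callback. A minimal sketch of producing that suffix (helper names are illustrative, not Ansible's actual callback code):

```python
from datetime import timedelta

def fmt_delta(seconds: float) -> str:
    """Format elapsed seconds as H:MM:SS.mmm, e.g. 0:01:04.690."""
    total = int(timedelta(seconds=seconds).total_seconds())
    hours, rem = divmod(total, 3600)
    minutes, secs = divmod(rem, 60)
    millis = int(round((seconds - total) * 1000))
    return f"{hours}:{minutes:02d}:{secs:02d}.{millis:03d}"

def task_banner_suffix(last_task_s: float, playbook_total_s: float) -> str:
    """Build the '(<last task elapsed>) <cumulative elapsed>' suffix seen in the log."""
    return f"({fmt_delta(last_task_s)}) {fmt_delta(playbook_total_s)}"

print(task_banner_suffix(0.060, 64.690))  # → (0:00:00.060) 0:01:04.690
```

The numbers fed in here match the banner above: the include task itself took 60 ms, at one minute four seconds into the run.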
30529 1726882650.66879: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882650.66956: in run() - task 12673a56-9f93-b0f1-edc0-0000000015cf 30529 1726882650.66966: variable 'ansible_search_path' from source: unknown 30529 1726882650.66970: variable 'ansible_search_path' from source: unknown 30529 1726882650.67010: calling self._execute() 30529 1726882650.67080: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.67085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.67097: variable 'omit' from source: magic vars 30529 1726882650.67365: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.67375: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.67381: _execute() done 30529 1726882650.67384: dumping result to json 30529 1726882650.67387: done dumping result, returning 30529 1726882650.67397: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-0000000015cf] 30529 1726882650.67403: sending task result for task 12673a56-9f93-b0f1-edc0-0000000015cf 30529 1726882650.67480: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000015cf 30529 1726882650.67483: WORKER PROCESS EXITING 30529 1726882650.67511: no more pending results, returning what we have 30529 1726882650.67516: in VariableManager get_vars() 30529 1726882650.67552: Calling all_inventory to load vars for managed_node1 30529 1726882650.67555: Calling groups_inventory to load vars for managed_node1 30529 1726882650.67558: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.67571: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.67574: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.67577: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882650.68350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.69191: done with get_vars() 30529 1726882650.69205: variable 'ansible_search_path' from source: unknown 30529 1726882650.69207: variable 'ansible_search_path' from source: unknown 30529 1726882650.69213: variable 'item' from source: include params 30529 1726882650.69296: variable 'item' from source: include params 30529 1726882650.69319: we have included files to process 30529 1726882650.69320: generating all_blocks data 30529 1726882650.69322: done generating all_blocks data 30529 1726882650.69323: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882650.69324: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882650.69326: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882650.69444: done processing included file 30529 1726882650.69446: iterating over new_blocks loaded from include file 30529 1726882650.69447: in VariableManager get_vars() 30529 1726882650.69458: done with get_vars() 30529 1726882650.69459: filtering new block on tags 30529 1726882650.69475: done filtering new block on tags 30529 1726882650.69476: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882650.69480: extending task lists for all hosts with included blocks 30529 1726882650.69571: done extending task lists 30529 1726882650.69572: done processing included files 30529 1726882650.69573: results queue empty 30529 1726882650.69573: checking for any_errors_fatal 30529 1726882650.69576: done 
checking for any_errors_fatal 30529 1726882650.69577: checking for max_fail_percentage 30529 1726882650.69577: done checking for max_fail_percentage 30529 1726882650.69578: checking to see if all hosts have failed and the running result is not ok 30529 1726882650.69578: done checking to see if all hosts have failed 30529 1726882650.69579: getting the remaining hosts for this loop 30529 1726882650.69580: done getting the remaining hosts for this loop 30529 1726882650.69581: getting the next task for host managed_node1 30529 1726882650.69584: done getting next task for host managed_node1 30529 1726882650.69586: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882650.69588: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882650.69589: getting variables 30529 1726882650.69590: in VariableManager get_vars() 30529 1726882650.69598: Calling all_inventory to load vars for managed_node1 30529 1726882650.69600: Calling groups_inventory to load vars for managed_node1 30529 1726882650.69601: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882650.69605: Calling all_plugins_play to load vars for managed_node1 30529 1726882650.69606: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882650.69607: Calling groups_plugins_play to load vars for managed_node1 30529 1726882650.73895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882650.74735: done with get_vars() 30529 1726882650.74751: done getting variables 30529 1726882650.74845: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:37:30 -0400 (0:00:00.084) 0:01:04.774 ****** 30529 1726882650.74865: entering _queue_task() for managed_node1/stat 30529 1726882650.75141: worker is 1 (out of 1 available) 30529 1726882650.75154: exiting _queue_task() for managed_node1/stat 30529 1726882650.75167: done queuing things up, now waiting for results queue to drain 30529 1726882650.75170: waiting for pending results... 
30529 1726882650.75357: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882650.75460: in run() - task 12673a56-9f93-b0f1-edc0-000000001647 30529 1726882650.75472: variable 'ansible_search_path' from source: unknown 30529 1726882650.75476: variable 'ansible_search_path' from source: unknown 30529 1726882650.75515: calling self._execute() 30529 1726882650.75579: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.75583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.75591: variable 'omit' from source: magic vars 30529 1726882650.75868: variable 'ansible_distribution_major_version' from source: facts 30529 1726882650.75878: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882650.75884: variable 'omit' from source: magic vars 30529 1726882650.75932: variable 'omit' from source: magic vars 30529 1726882650.76005: variable 'interface' from source: play vars 30529 1726882650.76021: variable 'omit' from source: magic vars 30529 1726882650.76056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882650.76087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882650.76108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882650.76122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.76134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882650.76157: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882650.76161: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.76163: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.76240: Set connection var ansible_shell_executable to /bin/sh 30529 1726882650.76244: Set connection var ansible_pipelining to False 30529 1726882650.76246: Set connection var ansible_shell_type to sh 30529 1726882650.76254: Set connection var ansible_timeout to 10 30529 1726882650.76257: Set connection var ansible_connection to ssh 30529 1726882650.76262: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882650.76278: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.76288: variable 'ansible_connection' from source: unknown 30529 1726882650.76290: variable 'ansible_module_compression' from source: unknown 30529 1726882650.76297: variable 'ansible_shell_type' from source: unknown 30529 1726882650.76300: variable 'ansible_shell_executable' from source: unknown 30529 1726882650.76302: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882650.76307: variable 'ansible_pipelining' from source: unknown 30529 1726882650.76309: variable 'ansible_timeout' from source: unknown 30529 1726882650.76312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882650.76460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882650.76469: variable 'omit' from source: magic vars 30529 1726882650.76474: starting attempt loop 30529 1726882650.76477: running the handler 30529 1726882650.76491: _low_level_execute_command(): starting 30529 1726882650.76504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882650.77009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.77013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.77018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.77070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.77073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.77075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.77125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.78964: stdout chunk (state=3): >>>/root <<< 30529 1726882650.78990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.79007: stdout chunk (state=3): >>><<< 30529 1726882650.79095: stderr chunk (state=3): >>><<< 30529 1726882650.79109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.79113: _low_level_execute_command(): starting 30529 1726882650.79116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733 `" && echo ansible-tmp-1726882650.790451-33605-268697506974733="` echo /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733 `" ) && sleep 0' 30529 1726882650.79788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.79806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882650.79873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.79926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.79948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.80000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.80078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.81938: stdout chunk (state=3): >>>ansible-tmp-1726882650.790451-33605-268697506974733=/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733 <<< 30529 1726882650.82047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.82071: stderr chunk (state=3): >>><<< 30529 1726882650.82075: stdout chunk (state=3): >>><<< 30529 1726882650.82098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882650.790451-33605-268697506974733=/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.82136: variable 'ansible_module_compression' from source: unknown 30529 1726882650.82184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882650.82217: variable 'ansible_facts' from source: unknown 30529 1726882650.82278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py 30529 1726882650.82374: Sending initial data 30529 1726882650.82377: Sent initial data (152 bytes) 30529 1726882650.82814: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.82819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882650.82822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.82825: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882650.82828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.82877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.82882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882650.82884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.82922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.84425: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882650.84432: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882650.84463: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882650.84512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpz13phabw /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py <<< 30529 1726882650.84519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py" <<< 30529 1726882650.84560: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpz13phabw" to remote "/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py" <<< 30529 1726882650.85074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.85114: stderr chunk (state=3): >>><<< 30529 1726882650.85117: stdout chunk (state=3): >>><<< 30529 1726882650.85142: done transferring module to remote 30529 1726882650.85151: _low_level_execute_command(): starting 30529 1726882650.85155: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/ /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py && sleep 0' 30529 1726882650.85582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882650.85585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882650.85588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.85594: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.85601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.85646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.85649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882650.85698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882650.87400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882650.87423: stderr chunk (state=3): >>><<< 30529 1726882650.87426: stdout chunk (state=3): >>><<< 30529 1726882650.87440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882650.87443: _low_level_execute_command(): starting 30529 1726882650.87445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/AnsiballZ_stat.py && sleep 0' 30529 1726882650.87849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882650.87854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.87865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882650.87924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882650.87933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882650.87971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.03161: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30826, "dev": 23, "nlink": 1, "atime": 1726882637.92835, "mtime": 1726882637.92835, "ctime": 1726882637.92835, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882651.04346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.04453: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882651.04456: stdout chunk (state=3): >>><<< 30529 1726882651.04459: stderr chunk (state=3): >>><<< 30529 1726882651.04479: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30826, "dev": 23, "nlink": 1, "atime": 1726882637.92835, "mtime": 1726882637.92835, "ctime": 1726882637.92835, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882651.04600: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882651.04607: _low_level_execute_command(): starting 30529 1726882651.04611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882650.790451-33605-268697506974733/ > /dev/null 2>&1 && sleep 0' 30529 1726882651.05213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.05230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.05244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.05264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882651.05316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.05386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.05441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.05479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.07320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.07699: stderr chunk (state=3): >>><<< 30529 1726882651.07703: stdout chunk (state=3): >>><<< 30529 1726882651.07706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.07708: handler run complete 30529 1726882651.07710: attempt loop complete, returning result 30529 1726882651.07712: _execute() done 30529 1726882651.07714: dumping result to json 30529 1726882651.07717: done dumping result, returning 30529 1726882651.07719: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000001647] 30529 1726882651.07721: sending task result for task 12673a56-9f93-b0f1-edc0-000000001647 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882637.92835, "block_size": 4096, "blocks": 0, "ctime": 1726882637.92835, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30826, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882637.92835, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30529 1726882651.07927: no more pending results, returning what we have 30529 1726882651.07931: results queue empty 30529 1726882651.07932: checking for any_errors_fatal 30529 1726882651.07934: done checking for any_errors_fatal 30529 
1726882651.07934: checking for max_fail_percentage 30529 1726882651.07936: done checking for max_fail_percentage 30529 1726882651.07937: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.07938: done checking to see if all hosts have failed 30529 1726882651.07938: getting the remaining hosts for this loop 30529 1726882651.07940: done getting the remaining hosts for this loop 30529 1726882651.07944: getting the next task for host managed_node1 30529 1726882651.07953: done getting next task for host managed_node1 30529 1726882651.07956: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30529 1726882651.07959: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882651.07964: getting variables 30529 1726882651.07966: in VariableManager get_vars() 30529 1726882651.08413: Calling all_inventory to load vars for managed_node1 30529 1726882651.08417: Calling groups_inventory to load vars for managed_node1 30529 1726882651.08421: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.08435: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.08439: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.08442: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.09126: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001647 30529 1726882651.09130: WORKER PROCESS EXITING 30529 1726882651.10769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.12309: done with get_vars() 30529 1726882651.12332: done getting variables 30529 1726882651.12392: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882651.12515: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:37:31 -0400 (0:00:00.376) 0:01:05.151 ****** 30529 1726882651.12553: entering _queue_task() for managed_node1/assert 30529 1726882651.12938: worker is 1 (out of 1 available) 30529 1726882651.12950: exiting _queue_task() for managed_node1/assert 30529 1726882651.12964: done queuing things up, now waiting for results queue to drain 30529 1726882651.12965: waiting for pending results... 
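Annotation: the stat payload returned above reports one boolean per permission/type bit (`rusr`, `wusr`, `islnk`, ...) alongside `"mode": "0777"`. All of those fields are derived from the single `st_mode` word of the `lstat()` result. A minimal Python sketch of that decoding, using only the stdlib `stat` module (the helper name `decode_mode` is illustrative, not part of Ansible):

```python
import stat

def decode_mode(st_mode: int) -> dict:
    """Derive the boolean flags that Ansible's stat module reports
    (rusr, wgrp, islnk, ...) from a raw st_mode value."""
    return {
        "mode": "%04o" % stat.S_IMODE(st_mode),
        "islnk": stat.S_ISLNK(st_mode),
        "isdir": stat.S_ISDIR(st_mode),
        "isreg": stat.S_ISREG(st_mode),
        "rusr": bool(st_mode & stat.S_IRUSR),
        "wusr": bool(st_mode & stat.S_IWUSR),
        "xusr": bool(st_mode & stat.S_IXUSR),
        "rgrp": bool(st_mode & stat.S_IRGRP),
        "wgrp": bool(st_mode & stat.S_IWGRP),
        "xgrp": bool(st_mode & stat.S_IXGRP),
        "roth": bool(st_mode & stat.S_IROTH),
        "woth": bool(st_mode & stat.S_IWOTH),
        "xoth": bool(st_mode & stat.S_IXOTH),
        "isuid": bool(st_mode & stat.S_ISUID),
        "isgid": bool(st_mode & stat.S_ISGID),
    }

# A symlink with 0777 permissions, as /sys/class/net/statebr shows above:
flags = decode_mode(stat.S_IFLNK | 0o777)
```

With that input, `flags` carries `islnk=True` and every rwx bit set, matching the task result for `/sys/class/net/statebr`.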
30529 1726882651.13571: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' 30529 1726882651.14010: in run() - task 12673a56-9f93-b0f1-edc0-0000000015d0 30529 1726882651.14014: variable 'ansible_search_path' from source: unknown 30529 1726882651.14017: variable 'ansible_search_path' from source: unknown 30529 1726882651.14020: calling self._execute() 30529 1726882651.14071: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.14125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.14238: variable 'omit' from source: magic vars 30529 1726882651.14847: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.15004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.15015: variable 'omit' from source: magic vars 30529 1726882651.15064: variable 'omit' from source: magic vars 30529 1726882651.15183: variable 'interface' from source: play vars 30529 1726882651.15348: variable 'omit' from source: magic vars 30529 1726882651.15402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882651.15506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882651.15614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882651.15658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.15709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.15781: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882651.15822: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.15866: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.16101: Set connection var ansible_shell_executable to /bin/sh 30529 1726882651.16300: Set connection var ansible_pipelining to False 30529 1726882651.16303: Set connection var ansible_shell_type to sh 30529 1726882651.16305: Set connection var ansible_timeout to 10 30529 1726882651.16307: Set connection var ansible_connection to ssh 30529 1726882651.16310: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882651.16312: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.16314: variable 'ansible_connection' from source: unknown 30529 1726882651.16316: variable 'ansible_module_compression' from source: unknown 30529 1726882651.16318: variable 'ansible_shell_type' from source: unknown 30529 1726882651.16320: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.16322: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.16324: variable 'ansible_pipelining' from source: unknown 30529 1726882651.16326: variable 'ansible_timeout' from source: unknown 30529 1726882651.16328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.16808: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882651.16812: variable 'omit' from source: magic vars 30529 1726882651.16814: starting attempt loop 30529 1726882651.16818: running the handler 30529 1726882651.17057: variable 'interface_stat' from source: set_fact 30529 1726882651.17082: Evaluated conditional (interface_stat.stat.exists): True 30529 1726882651.17118: handler run complete 30529 1726882651.17168: attempt loop complete, returning result 30529 
1726882651.17176: _execute() done 30529 1726882651.17202: dumping result to json 30529 1726882651.17211: done dumping result, returning 30529 1726882651.17225: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'statebr' [12673a56-9f93-b0f1-edc0-0000000015d0] 30529 1726882651.17253: sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d0 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882651.17745: no more pending results, returning what we have 30529 1726882651.17749: results queue empty 30529 1726882651.17751: checking for any_errors_fatal 30529 1726882651.17760: done checking for any_errors_fatal 30529 1726882651.17761: checking for max_fail_percentage 30529 1726882651.17762: done checking for max_fail_percentage 30529 1726882651.17763: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.17764: done checking to see if all hosts have failed 30529 1726882651.17765: getting the remaining hosts for this loop 30529 1726882651.17767: done getting the remaining hosts for this loop 30529 1726882651.17772: getting the next task for host managed_node1 30529 1726882651.17783: done getting next task for host managed_node1 30529 1726882651.17786: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30529 1726882651.17795: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882651.17800: getting variables 30529 1726882651.17802: in VariableManager get_vars() 30529 1726882651.17842: Calling all_inventory to load vars for managed_node1 30529 1726882651.17845: Calling groups_inventory to load vars for managed_node1 30529 1726882651.17849: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.17861: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.17864: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.17867: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.18514: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d0 30529 1726882651.18517: WORKER PROCESS EXITING 30529 1726882651.19633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.21916: done with get_vars() 30529 1726882651.21948: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:37:31 -0400 (0:00:00.094) 0:01:05.246 ****** 30529 1726882651.22047: entering _queue_task() for managed_node1/include_tasks 30529 1726882651.22404: worker is 1 (out of 1 available) 30529 1726882651.22418: exiting _queue_task() for managed_node1/include_tasks 30529 1726882651.22433: done queuing things up, now waiting for results queue to drain 30529 1726882651.22435: waiting for pending results... 
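Annotation: the assert task above ran entirely on the controller; its handler evaluated the conditional `interface_stat.stat.exists` against the stat result registered by the previous task and reported "All assertions passed". A loose Python model of that flow (the callable form stands in for the Jinja2 conditionals the real action renders; `assert_that` is an illustrative name, not an Ansible API):

```python
# Shape of the registered result the conditional was evaluated against,
# with values taken from the task output above:
interface_stat = {"stat": {"exists": True, "islnk": True,
                           "lnk_target": "../../devices/virtual/net/statebr"}}

def assert_that(conditions):
    """Loose model of the assert action's handler: every entry in the
    'that' list must evaluate truthy, otherwise the task fails."""
    for cond in conditions:
        if not cond():
            return {"failed": True, "msg": "Assertion failed"}
    return {"changed": False, "msg": "All assertions passed"}

result = assert_that([lambda: interface_stat["stat"]["exists"]])
# result: {"changed": False, "msg": "All assertions passed"}
```

Because the conditional is truthy, no module is shipped to the remote host; the `ok:` result above is produced locally.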
30529 1726882651.22835: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 30529 1726882651.23039: in run() - task 12673a56-9f93-b0f1-edc0-0000000015d4 30529 1726882651.23043: variable 'ansible_search_path' from source: unknown 30529 1726882651.23046: variable 'ansible_search_path' from source: unknown 30529 1726882651.23053: calling self._execute() 30529 1726882651.23178: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.23182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.23196: variable 'omit' from source: magic vars 30529 1726882651.24198: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.24202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.24204: _execute() done 30529 1726882651.24207: dumping result to json 30529 1726882651.24208: done dumping result, returning 30529 1726882651.24210: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-0000000015d4] 30529 1726882651.24212: sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d4 30529 1726882651.24272: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d4 30529 1726882651.24274: WORKER PROCESS EXITING 30529 1726882651.24298: no more pending results, returning what we have 30529 1726882651.24302: in VariableManager get_vars() 30529 1726882651.24338: Calling all_inventory to load vars for managed_node1 30529 1726882651.24341: Calling groups_inventory to load vars for managed_node1 30529 1726882651.24344: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.24353: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.24356: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.24359: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882651.26814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.28860: done with get_vars() 30529 1726882651.28882: variable 'ansible_search_path' from source: unknown 30529 1726882651.28883: variable 'ansible_search_path' from source: unknown 30529 1726882651.28894: variable 'item' from source: include params 30529 1726882651.29200: variable 'item' from source: include params 30529 1726882651.29235: we have included files to process 30529 1726882651.29236: generating all_blocks data 30529 1726882651.29239: done generating all_blocks data 30529 1726882651.29245: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882651.29246: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882651.29248: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882651.30714: done processing included file 30529 1726882651.30716: iterating over new_blocks loaded from include file 30529 1726882651.30717: in VariableManager get_vars() 30529 1726882651.30735: done with get_vars() 30529 1726882651.30737: filtering new block on tags 30529 1726882651.31048: done filtering new block on tags 30529 1726882651.31052: in VariableManager get_vars() 30529 1726882651.31069: done with get_vars() 30529 1726882651.31071: filtering new block on tags 30529 1726882651.31335: done filtering new block on tags 30529 1726882651.31337: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 30529 1726882651.31343: extending task lists for all hosts with included blocks 30529 1726882651.31699: done 
extending task lists 30529 1726882651.31701: done processing included files 30529 1726882651.31702: results queue empty 30529 1726882651.31703: checking for any_errors_fatal 30529 1726882651.31707: done checking for any_errors_fatal 30529 1726882651.31708: checking for max_fail_percentage 30529 1726882651.31709: done checking for max_fail_percentage 30529 1726882651.31709: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.31710: done checking to see if all hosts have failed 30529 1726882651.31711: getting the remaining hosts for this loop 30529 1726882651.31713: done getting the remaining hosts for this loop 30529 1726882651.31715: getting the next task for host managed_node1 30529 1726882651.31720: done getting next task for host managed_node1 30529 1726882651.31722: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882651.31726: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882651.31728: getting variables 30529 1726882651.31729: in VariableManager get_vars() 30529 1726882651.31739: Calling all_inventory to load vars for managed_node1 30529 1726882651.31741: Calling groups_inventory to load vars for managed_node1 30529 1726882651.31744: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.31750: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.31752: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.31755: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.33296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.35597: done with get_vars() 30529 1726882651.35619: done getting variables 30529 1726882651.35664: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:37:31 -0400 (0:00:00.136) 0:01:05.383 ****** 30529 1726882651.35851: entering _queue_task() for managed_node1/set_fact 30529 1726882651.36275: worker is 1 (out of 1 available) 30529 1726882651.36290: exiting _queue_task() for managed_node1/set_fact 30529 1726882651.36502: done queuing things up, now waiting for results queue to drain 30529 1726882651.36504: waiting for pending results... 
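Annotation: the set_fact task queued above also runs purely on the controller; it merges the three `lsr_net_profile_*` flags (visible in its `ansible_facts` result further on) into the host's fact namespace and reports `changed: false`. A loose Python model of that merge (`host_facts`/`set_fact` are illustrative names, not Ansible internals):

```python
host_facts = {}

def set_fact(facts, new):
    """Loose model of the set_fact action: merge the given key/values
    into the host's fact namespace; the task reports changed=false."""
    facts.update(new)
    return {"changed": False, "ansible_facts": dict(new)}

# The three flags this task initializes, per the logged result:
result = set_fact(host_facts, {
    "lsr_net_profile_exists": False,
    "lsr_net_profile_ansible_managed": False,
    "lsr_net_profile_fingerprint": False,
})
```

Later tasks in `get_profile_stat.yml` flip these flags as evidence is gathered; initializing them first keeps the subsequent asserts well-defined.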
30529 1726882651.36596: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882651.36730: in run() - task 12673a56-9f93-b0f1-edc0-000000001665 30529 1726882651.36754: variable 'ansible_search_path' from source: unknown 30529 1726882651.36763: variable 'ansible_search_path' from source: unknown 30529 1726882651.36807: calling self._execute() 30529 1726882651.36907: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.36919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.36935: variable 'omit' from source: magic vars 30529 1726882651.37277: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.37289: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.37299: variable 'omit' from source: magic vars 30529 1726882651.37336: variable 'omit' from source: magic vars 30529 1726882651.37359: variable 'omit' from source: magic vars 30529 1726882651.37394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882651.37425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882651.37445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882651.37458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.37469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.37498: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882651.37501: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.37504: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882651.37573: Set connection var ansible_shell_executable to /bin/sh 30529 1726882651.37576: Set connection var ansible_pipelining to False 30529 1726882651.37578: Set connection var ansible_shell_type to sh 30529 1726882651.37588: Set connection var ansible_timeout to 10 30529 1726882651.37592: Set connection var ansible_connection to ssh 30529 1726882651.37600: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882651.37617: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.37620: variable 'ansible_connection' from source: unknown 30529 1726882651.37624: variable 'ansible_module_compression' from source: unknown 30529 1726882651.37627: variable 'ansible_shell_type' from source: unknown 30529 1726882651.37630: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.37632: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.37635: variable 'ansible_pipelining' from source: unknown 30529 1726882651.37637: variable 'ansible_timeout' from source: unknown 30529 1726882651.37639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.37742: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882651.37753: variable 'omit' from source: magic vars 30529 1726882651.37756: starting attempt loop 30529 1726882651.37759: running the handler 30529 1726882651.37770: handler run complete 30529 1726882651.37778: attempt loop complete, returning result 30529 1726882651.37781: _execute() done 30529 1726882651.37783: dumping result to json 30529 1726882651.37786: done dumping result, returning 30529 1726882651.37797: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-000000001665] 30529 1726882651.37803: sending task result for task 12673a56-9f93-b0f1-edc0-000000001665 30529 1726882651.37883: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001665 30529 1726882651.37886: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30529 1726882651.37967: no more pending results, returning what we have 30529 1726882651.37971: results queue empty 30529 1726882651.37972: checking for any_errors_fatal 30529 1726882651.37973: done checking for any_errors_fatal 30529 1726882651.37974: checking for max_fail_percentage 30529 1726882651.37976: done checking for max_fail_percentage 30529 1726882651.37976: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.37977: done checking to see if all hosts have failed 30529 1726882651.37978: getting the remaining hosts for this loop 30529 1726882651.37980: done getting the remaining hosts for this loop 30529 1726882651.37983: getting the next task for host managed_node1 30529 1726882651.37990: done getting next task for host managed_node1 30529 1726882651.37994: ^ task is: TASK: Stat profile file 30529 1726882651.37999: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882651.38003: getting variables 30529 1726882651.38005: in VariableManager get_vars() 30529 1726882651.38041: Calling all_inventory to load vars for managed_node1 30529 1726882651.38043: Calling groups_inventory to load vars for managed_node1 30529 1726882651.38046: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.38056: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.38059: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.38061: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.38920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.40224: done with get_vars() 30529 1726882651.40242: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:37:31 -0400 (0:00:00.046) 0:01:05.429 ****** 30529 1726882651.40318: entering _queue_task() for managed_node1/stat 30529 1726882651.40571: worker is 1 (out of 1 available) 30529 1726882651.40584: exiting _queue_task() for managed_node1/stat 30529 1726882651.40598: done queuing things up, now waiting for results queue to drain 30529 1726882651.40600: 
waiting for pending results... 30529 1726882651.40779: running TaskExecutor() for managed_node1/TASK: Stat profile file 30529 1726882651.40870: in run() - task 12673a56-9f93-b0f1-edc0-000000001666 30529 1726882651.40880: variable 'ansible_search_path' from source: unknown 30529 1726882651.40884: variable 'ansible_search_path' from source: unknown 30529 1726882651.40915: calling self._execute() 30529 1726882651.40987: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.40990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.41004: variable 'omit' from source: magic vars 30529 1726882651.41280: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.41291: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.41302: variable 'omit' from source: magic vars 30529 1726882651.41340: variable 'omit' from source: magic vars 30529 1726882651.41414: variable 'profile' from source: play vars 30529 1726882651.41418: variable 'interface' from source: play vars 30529 1726882651.41461: variable 'interface' from source: play vars 30529 1726882651.41476: variable 'omit' from source: magic vars 30529 1726882651.41512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882651.41540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882651.41557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882651.41569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.41580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.41611: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882651.41614: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.41617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.41684: Set connection var ansible_shell_executable to /bin/sh 30529 1726882651.41687: Set connection var ansible_pipelining to False 30529 1726882651.41689: Set connection var ansible_shell_type to sh 30529 1726882651.41705: Set connection var ansible_timeout to 10 30529 1726882651.41707: Set connection var ansible_connection to ssh 30529 1726882651.41710: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882651.41727: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.41730: variable 'ansible_connection' from source: unknown 30529 1726882651.41733: variable 'ansible_module_compression' from source: unknown 30529 1726882651.41735: variable 'ansible_shell_type' from source: unknown 30529 1726882651.41737: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.41739: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.41741: variable 'ansible_pipelining' from source: unknown 30529 1726882651.41744: variable 'ansible_timeout' from source: unknown 30529 1726882651.41749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.41968: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882651.41972: variable 'omit' from source: magic vars 30529 1726882651.41975: starting attempt loop 30529 1726882651.41977: running the handler 30529 1726882651.41979: _low_level_execute_command(): starting 30529 1726882651.41981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 
1726882651.42699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.42703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.42706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.42709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882651.42713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882651.42716: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882651.42718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.42722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882651.42731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882651.42738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882651.42746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.42756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.42803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882651.42806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882651.42808: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882651.42810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.42866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.42882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
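The `auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'` debug lines above show Ansible's ssh connection plugin reusing a multiplexed ControlMaster socket, so each task's command rides an already-open connection instead of a fresh SSH handshake. A minimal sketch of the relevant ssh options — this is not Ansible's actual plugin code, and the function name `build_ssh_args` is hypothetical:

```python
def build_ssh_args(host, control_path, timeout=10):
    """Return ssh CLI args that reuse a ControlMaster socket if one exists.

    Sketch only: Ansible derives control_path by hashing host/user/port
    (e.g. /root/.ansible/cp/5685534f65 in the log above).
    """
    return [
        "ssh",
        # Create the master on first use; later invocations attach to it.
        "-o", "ControlMaster=auto",
        # Keep the master alive between tasks so follow-up commands are fast.
        "-o", "ControlPersist=60s",
        # Socket path shared by all connections to this host.
        "-o", f"ControlPath={control_path}",
        "-o", f"ConnectTimeout={timeout}",
        host,
    ]
```

With these options, the `mux_client_hello_exchange` / `mux_client_request_session` lines in the log correspond to the client attaching a new session to the persistent master.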
30529 1726882651.42899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.42978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.44615: stdout chunk (state=3): >>>/root <<< 30529 1726882651.44776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.44780: stdout chunk (state=3): >>><<< 30529 1726882651.44782: stderr chunk (state=3): >>><<< 30529 1726882651.44809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.44925: _low_level_execute_command(): starting 30529 1726882651.44932: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853 `" && echo ansible-tmp-1726882651.4482403-33642-217104499925853="` echo /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853 `" ) && sleep 0' 30529 1726882651.45501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.45516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.45561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.45564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882651.45662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.45678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882651.45699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.45787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.47648: stdout chunk (state=3): >>>ansible-tmp-1726882651.4482403-33642-217104499925853=/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853 <<< 30529 1726882651.47789: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.47801: stdout chunk (state=3): >>><<< 30529 1726882651.47885: stderr chunk (state=3): >>><<< 30529 1726882651.47890: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882651.4482403-33642-217104499925853=/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.47895: variable 'ansible_module_compression' from source: unknown 30529 1726882651.47961: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882651.48016: variable 'ansible_facts' from source: unknown 30529 1726882651.48126: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py 30529 1726882651.48320: Sending initial data 30529 1726882651.48324: Sent initial data (153 bytes) 30529 1726882651.48903: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.48913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.49007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.49021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.49032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882651.49074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.49120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.50617: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882651.50628: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882651.50639: stderr chunk (state=3): >>>debug2: Server supports extension 
"statvfs@openssh.com" revision 2 <<< 30529 1726882651.50677: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882651.50724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882651.50776: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcbaim_m3 /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py <<< 30529 1726882651.50781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py" <<< 30529 1726882651.50828: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpcbaim_m3" to remote "/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py" <<< 30529 1726882651.51581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.51716: stderr chunk (state=3): >>><<< 30529 1726882651.51719: stdout chunk (state=3): >>><<< 30529 1726882651.51721: done transferring module to remote 30529 1726882651.51724: _low_level_execute_command(): starting 30529 1726882651.51726: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/ /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py && sleep 0' 30529 1726882651.52361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.52403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.52409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882651.52412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.52414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.52417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.52479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882651.52483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.52528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.54236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.54266: stderr chunk (state=3): >>><<< 30529 1726882651.54269: stdout chunk 
(state=3): >>><<< 30529 1726882651.54350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.54353: _low_level_execute_command(): starting 30529 1726882651.54355: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/AnsiballZ_stat.py && sleep 0' 30529 1726882651.54869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.54891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.54945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882651.54948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882651.54950: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882651.54952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.54954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.54956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882651.54963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.55009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.55026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.55077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.70092: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882651.71296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882651.71323: stderr chunk (state=3): >>><<< 30529 1726882651.71326: stdout chunk (state=3): >>><<< 30529 1726882651.71347: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
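The module output above — `{"changed": false, "stat": {"exists": false}}` for `/etc/sysconfig/network-scripts/ifcfg-statebr` — is the core behavior of `ansible.builtin.stat` when the path is absent. A minimal sketch of that check, assuming a simplified result shape (`stat_result` is a hypothetical helper, not the module's real entry point):

```python
import os


def stat_result(path):
    """Sketch of ansible.builtin.stat's core logic: never changes state,
    and reports stat.exists == False for a missing path."""
    if not os.path.lexists(path):
        return {"changed": False, "stat": {"exists": False}}
    st = os.stat(path)
    return {
        "changed": False,
        "stat": {
            "exists": True,
            "size": st.st_size,
            # Permission bits only, as an octal string.
            "mode": oct(st.st_mode & 0o7777),
        },
    }
```

Because `exists` comes back `False`, the following `Set NM profile exist flag` task's `when: profile_stat.stat.exists` condition evaluates false and the task is skipped, exactly as the log shows.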
30529 1726882651.71371: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882651.71381: _low_level_execute_command(): starting 30529 1726882651.71385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882651.4482403-33642-217104499925853/ > /dev/null 2>&1 && sleep 0' 30529 1726882651.71840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.71848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.71851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882651.71853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882651.71855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.71910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882651.71913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.71950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.73772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.73798: stderr chunk (state=3): >>><<< 30529 1726882651.73803: stdout chunk (state=3): >>><<< 30529 1726882651.73822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.73828: handler run complete 30529 1726882651.73843: attempt loop complete, returning result 30529 1726882651.73846: _execute() done 30529 1726882651.73848: dumping result to json 30529 1726882651.73851: done dumping result, returning 30529 1726882651.73858: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-000000001666] 30529 1726882651.73862: sending task result for task 12673a56-9f93-b0f1-edc0-000000001666 30529 1726882651.73958: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001666 30529 1726882651.73961: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882651.74019: no more pending results, returning what we have 30529 1726882651.74023: results queue empty 30529 1726882651.74024: checking for any_errors_fatal 30529 1726882651.74031: done checking for any_errors_fatal 30529 1726882651.74032: checking for max_fail_percentage 30529 1726882651.74033: done checking for max_fail_percentage 30529 1726882651.74034: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.74035: done checking to see if all hosts have failed 30529 1726882651.74036: getting the remaining hosts for this loop 30529 1726882651.74037: done getting the remaining hosts for this loop 30529 1726882651.74041: getting the next task for host managed_node1 30529 1726882651.74049: done getting next task for host managed_node1 30529 1726882651.74051: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882651.74057: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882651.74061: getting variables 30529 1726882651.74062: in VariableManager get_vars() 30529 1726882651.74106: Calling all_inventory to load vars for managed_node1 30529 1726882651.74109: Calling groups_inventory to load vars for managed_node1 30529 1726882651.74112: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.74123: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.74126: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.74129: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.75113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.76443: done with get_vars() 30529 1726882651.76460: done getting variables 30529 1726882651.76509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:37:31 -0400 (0:00:00.362) 0:01:05.791 ****** 30529 1726882651.76533: entering _queue_task() for managed_node1/set_fact 30529 1726882651.76785: worker is 1 (out of 1 available) 30529 1726882651.76800: exiting _queue_task() for managed_node1/set_fact 30529 1726882651.76812: done queuing things up, now waiting for results queue to drain 30529 1726882651.76814: waiting for pending results... 30529 1726882651.76998: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882651.77085: in run() - task 12673a56-9f93-b0f1-edc0-000000001667 30529 1726882651.77102: variable 'ansible_search_path' from source: unknown 30529 1726882651.77106: variable 'ansible_search_path' from source: unknown 30529 1726882651.77133: calling self._execute() 30529 1726882651.77211: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.77215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.77224: variable 'omit' from source: magic vars 30529 1726882651.77512: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.77522: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.77610: variable 'profile_stat' from source: set_fact 30529 1726882651.77620: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882651.77623: when evaluation is False, skipping this task 30529 1726882651.77626: _execute() done 30529 1726882651.77629: dumping result to json 30529 1726882651.77631: done dumping 
result, returning 30529 1726882651.77638: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-000000001667] 30529 1726882651.77642: sending task result for task 12673a56-9f93-b0f1-edc0-000000001667 30529 1726882651.77725: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001667 30529 1726882651.77728: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882651.77771: no more pending results, returning what we have 30529 1726882651.77778: results queue empty 30529 1726882651.77780: checking for any_errors_fatal 30529 1726882651.77795: done checking for any_errors_fatal 30529 1726882651.77795: checking for max_fail_percentage 30529 1726882651.77797: done checking for max_fail_percentage 30529 1726882651.77798: checking to see if all hosts have failed and the running result is not ok 30529 1726882651.77799: done checking to see if all hosts have failed 30529 1726882651.77800: getting the remaining hosts for this loop 30529 1726882651.77802: done getting the remaining hosts for this loop 30529 1726882651.77805: getting the next task for host managed_node1 30529 1726882651.77813: done getting next task for host managed_node1 30529 1726882651.77816: ^ task is: TASK: Get NM profile info 30529 1726882651.77820: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882651.77825: getting variables 30529 1726882651.77827: in VariableManager get_vars() 30529 1726882651.77862: Calling all_inventory to load vars for managed_node1 30529 1726882651.77864: Calling groups_inventory to load vars for managed_node1 30529 1726882651.77868: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882651.77884: Calling all_plugins_play to load vars for managed_node1 30529 1726882651.77887: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882651.77889: Calling groups_plugins_play to load vars for managed_node1 30529 1726882651.79082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882651.80822: done with get_vars() 30529 1726882651.80846: done getting variables 30529 1726882651.80908: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:37:31 -0400 (0:00:00.044) 
0:01:05.835 ****** 30529 1726882651.80942: entering _queue_task() for managed_node1/shell 30529 1726882651.81722: worker is 1 (out of 1 available) 30529 1726882651.81735: exiting _queue_task() for managed_node1/shell 30529 1726882651.81749: done queuing things up, now waiting for results queue to drain 30529 1726882651.81751: waiting for pending results... 30529 1726882651.82714: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882651.83001: in run() - task 12673a56-9f93-b0f1-edc0-000000001668 30529 1726882651.83005: variable 'ansible_search_path' from source: unknown 30529 1726882651.83008: variable 'ansible_search_path' from source: unknown 30529 1726882651.83037: calling self._execute() 30529 1726882651.83261: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.83265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.83278: variable 'omit' from source: magic vars 30529 1726882651.84331: variable 'ansible_distribution_major_version' from source: facts 30529 1726882651.84344: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882651.84350: variable 'omit' from source: magic vars 30529 1726882651.84525: variable 'omit' from source: magic vars 30529 1726882651.84876: variable 'profile' from source: play vars 30529 1726882651.84880: variable 'interface' from source: play vars 30529 1726882651.84947: variable 'interface' from source: play vars 30529 1726882651.85191: variable 'omit' from source: magic vars 30529 1726882651.85235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882651.85271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882651.85633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882651.85636: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.85638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882651.85641: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882651.85643: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.85645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.86069: Set connection var ansible_shell_executable to /bin/sh 30529 1726882651.86073: Set connection var ansible_pipelining to False 30529 1726882651.86075: Set connection var ansible_shell_type to sh 30529 1726882651.86077: Set connection var ansible_timeout to 10 30529 1726882651.86079: Set connection var ansible_connection to ssh 30529 1726882651.86082: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882651.86084: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.86087: variable 'ansible_connection' from source: unknown 30529 1726882651.86089: variable 'ansible_module_compression' from source: unknown 30529 1726882651.86090: variable 'ansible_shell_type' from source: unknown 30529 1726882651.86092: variable 'ansible_shell_executable' from source: unknown 30529 1726882651.86097: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882651.86124: variable 'ansible_pipelining' from source: unknown 30529 1726882651.86128: variable 'ansible_timeout' from source: unknown 30529 1726882651.86130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882651.86408: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882651.86420: variable 'omit' from source: magic vars 30529 1726882651.86598: starting attempt loop 30529 1726882651.86601: running the handler 30529 1726882651.86605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882651.86608: _low_level_execute_command(): starting 30529 1726882651.86610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882651.88181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882651.88291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.88401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 30529 1726882651.88429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.88510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.90142: stdout chunk (state=3): >>>/root <<< 30529 1726882651.90347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.90350: stdout chunk (state=3): >>><<< 30529 1726882651.90353: stderr chunk (state=3): >>><<< 30529 1726882651.90378: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.90403: _low_level_execute_command(): starting 30529 1726882651.90416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792 `" && echo ansible-tmp-1726882651.9038546-33659-271273027451792="` echo /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792 `" ) && sleep 0' 30529 1726882651.91586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.91667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.91787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882651.91802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882651.91883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882651.91991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.93769: stdout chunk (state=3): >>>ansible-tmp-1726882651.9038546-33659-271273027451792=/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792 <<< 30529 1726882651.93876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.94007: stderr 
chunk (state=3): >>><<< 30529 1726882651.94099: stdout chunk (state=3): >>><<< 30529 1726882651.94103: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882651.9038546-33659-271273027451792=/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882651.94184: variable 'ansible_module_compression' from source: unknown 30529 1726882651.94401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882651.94404: variable 'ansible_facts' from source: unknown 30529 1726882651.94483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py 30529 1726882651.94919: Sending initial data 30529 1726882651.94922: Sent initial data (156 
bytes) 30529 1726882651.96277: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882651.96281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882651.96283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882651.96508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882651.98059: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 <<< 30529 1726882651.98111: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882651.98174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppdg31aww /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py <<< 30529 1726882651.98188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py" <<< 30529 1726882651.98314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmppdg31aww" to remote "/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py" <<< 30529 1726882651.99533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882651.99594: stderr chunk (state=3): >>><<< 30529 1726882651.99605: stdout chunk (state=3): >>><<< 30529 1726882651.99632: done transferring module to remote 30529 1726882651.99883: _low_level_execute_command(): starting 30529 1726882651.99886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/ /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py && sleep 0' 30529 1726882652.00926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882652.00940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882652.00952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30529 1726882652.00971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882652.01119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882652.01152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882652.01256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882652.03311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882652.03315: stdout chunk (state=3): >>><<< 30529 1726882652.03317: stderr chunk (state=3): >>><<< 30529 1726882652.03319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882652.03322: _low_level_execute_command(): starting 30529 1726882652.03324: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/AnsiballZ_command.py && sleep 0' 30529 1726882652.04664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882652.04707: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882652.04740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882652.04822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882652.21457: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:37:32.197129", "end": "2024-09-20 21:37:32.213488", "delta": "0:00:00.016359", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882652.22849: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882652.22867: stdout chunk (state=3): >>><<< 30529 1726882652.22880: stderr chunk (state=3): >>><<< 30529 1726882652.22909: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:37:32.197129", "end": "2024-09-20 21:37:32.213488", "delta": "0:00:00.016359", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 
closed. 30529 1726882652.22956: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882652.22980: _low_level_execute_command(): starting 30529 1726882652.22991: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882651.9038546-33659-271273027451792/ > /dev/null 2>&1 && sleep 0' 30529 1726882652.23715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882652.23744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882652.23761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882652.23783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882652.23861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882652.25713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882652.25717: stdout chunk (state=3): >>><<< 30529 1726882652.25720: stderr chunk (state=3): >>><<< 30529 1726882652.25746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 
1726882652.25752: handler run complete 30529 1726882652.25778: Evaluated conditional (False): False 30529 1726882652.25788: attempt loop complete, returning result 30529 1726882652.25790: _execute() done 30529 1726882652.25798: dumping result to json 30529 1726882652.25803: done dumping result, returning 30529 1726882652.25812: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-000000001668] 30529 1726882652.25817: sending task result for task 12673a56-9f93-b0f1-edc0-000000001668 fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016359", "end": "2024-09-20 21:37:32.213488", "rc": 1, "start": "2024-09-20 21:37:32.197129" } MSG: non-zero return code ...ignoring 30529 1726882652.25999: no more pending results, returning what we have 30529 1726882652.26005: results queue empty 30529 1726882652.26006: checking for any_errors_fatal 30529 1726882652.26013: done checking for any_errors_fatal 30529 1726882652.26013: checking for max_fail_percentage 30529 1726882652.26016: done checking for max_fail_percentage 30529 1726882652.26017: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.26018: done checking to see if all hosts have failed 30529 1726882652.26019: getting the remaining hosts for this loop 30529 1726882652.26021: done getting the remaining hosts for this loop 30529 1726882652.26026: getting the next task for host managed_node1 30529 1726882652.26035: done getting next task for host managed_node1 30529 1726882652.26038: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882652.26043: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882652.26048: getting variables 30529 1726882652.26049: in VariableManager get_vars() 30529 1726882652.26088: Calling all_inventory to load vars for managed_node1 30529 1726882652.26207: Calling groups_inventory to load vars for managed_node1 30529 1726882652.26212: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.26225: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.26228: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.26231: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.26834: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001668 30529 1726882652.26839: WORKER PROCESS EXITING 30529 1726882652.28063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.29637: done with get_vars() 30529 1726882652.29664: done getting variables 30529 1726882652.29739: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:37:32 -0400 (0:00:00.488) 0:01:06.323 ****** 30529 1726882652.29777: entering _queue_task() for managed_node1/set_fact 30529 1726882652.30179: worker is 1 (out of 1 available) 30529 1726882652.30400: exiting _queue_task() for managed_node1/set_fact 30529 1726882652.30411: done queuing things up, now waiting for results queue to drain 30529 1726882652.30413: waiting for pending results... 30529 1726882652.30527: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882652.30685: in run() - task 12673a56-9f93-b0f1-edc0-000000001669 30529 1726882652.30714: variable 'ansible_search_path' from source: unknown 30529 1726882652.30728: variable 'ansible_search_path' from source: unknown 30529 1726882652.30774: calling self._execute() 30529 1726882652.30881: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.30898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.30916: variable 'omit' from source: magic vars 30529 1726882652.31314: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.31332: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.31464: variable 'nm_profile_exists' from source: set_fact 30529 1726882652.31480: Evaluated conditional (nm_profile_exists.rc == 0): False 30529 1726882652.31506: when evaluation is False, skipping 
this task 30529 1726882652.31509: _execute() done 30529 1726882652.31511: dumping result to json 30529 1726882652.31600: done dumping result, returning 30529 1726882652.31603: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-000000001669] 30529 1726882652.31606: sending task result for task 12673a56-9f93-b0f1-edc0-000000001669 skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30529 1726882652.31720: no more pending results, returning what we have 30529 1726882652.31725: results queue empty 30529 1726882652.31726: checking for any_errors_fatal 30529 1726882652.31735: done checking for any_errors_fatal 30529 1726882652.31736: checking for max_fail_percentage 30529 1726882652.31738: done checking for max_fail_percentage 30529 1726882652.31739: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.31740: done checking to see if all hosts have failed 30529 1726882652.31740: getting the remaining hosts for this loop 30529 1726882652.31742: done getting the remaining hosts for this loop 30529 1726882652.31746: getting the next task for host managed_node1 30529 1726882652.31759: done getting next task for host managed_node1 30529 1726882652.31761: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882652.31767: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882652.31773: getting variables 30529 1726882652.31775: in VariableManager get_vars() 30529 1726882652.31819: Calling all_inventory to load vars for managed_node1 30529 1726882652.31822: Calling groups_inventory to load vars for managed_node1 30529 1726882652.31827: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.31841: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.31845: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.31848: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.32535: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001669 30529 1726882652.32539: WORKER PROCESS EXITING 30529 1726882652.33505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.35851: done with get_vars() 30529 1726882652.35877: done getting variables 30529 1726882652.35944: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.36067: variable 'profile' from source: play vars 30529 1726882652.36071: variable 'interface' from source: play vars 30529 1726882652.36137: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:37:32 -0400 (0:00:00.063) 0:01:06.387 ****** 30529 1726882652.36170: entering _queue_task() for managed_node1/command 30529 1726882652.36542: worker is 1 (out of 1 available) 30529 1726882652.36557: exiting _queue_task() for managed_node1/command 30529 1726882652.36573: done queuing things up, now waiting for results queue to drain 30529 1726882652.36575: waiting for pending results... 
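[Annotation] For readers following the trace: the earlier `Get NM profile info` task piped `nmcli -f NAME,FILENAME connection show` through `grep statebr | grep /etc`; rc=1 (no `statebr` profile persisted under /etc) was tolerated via `...ignoring`, and the follow-up `set_fact` task was then skipped because its condition `nm_profile_exists.rc == 0` evaluated False. A minimal Python sketch of the same matching logic (hypothetical helper and sample output, not the role's actual code):

```python
# Hypothetical re-implementation of the shell pipeline from the failed task:
#   nmcli -f NAME,FILENAME connection show | grep statebr | grep /etc
# grep-style return code: 0 on match, 1 on no match.
def nm_profile_in_etc(nmcli_output: str, profile: str) -> int:
    """Return 0 if any line mentions both the profile name and /etc, else 1."""
    for line in nmcli_output.splitlines():
        if profile in line and "/etc" in line:
            return 0
    return 1

# Sample output (assumed): the only connection file lives under /run, so the
# check yields rc=1, making the conditional (nm_profile_exists.rc == 0) False.
sample = "eth0  /run/NetworkManager/system-connections/eth0.nmconnection"
print(nm_profile_in_etc(sample, "statebr"))  # -> 1
```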
30529 1726882652.37046: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882652.37145: in run() - task 12673a56-9f93-b0f1-edc0-00000000166b 30529 1726882652.37159: variable 'ansible_search_path' from source: unknown 30529 1726882652.37163: variable 'ansible_search_path' from source: unknown 30529 1726882652.37197: calling self._execute() 30529 1726882652.37792: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.37798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.37806: variable 'omit' from source: magic vars 30529 1726882652.38567: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.38579: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.38913: variable 'profile_stat' from source: set_fact 30529 1726882652.38925: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882652.38928: when evaluation is False, skipping this task 30529 1726882652.38931: _execute() done 30529 1726882652.38933: dumping result to json 30529 1726882652.38936: done dumping result, returning 30529 1726882652.38943: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000166b] 30529 1726882652.38948: sending task result for task 12673a56-9f93-b0f1-edc0-00000000166b skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882652.39164: no more pending results, returning what we have 30529 1726882652.39168: results queue empty 30529 1726882652.39169: checking for any_errors_fatal 30529 1726882652.39175: done checking for any_errors_fatal 30529 1726882652.39176: checking for max_fail_percentage 30529 1726882652.39178: done checking for max_fail_percentage 30529 1726882652.39179: checking to see if all hosts 
have failed and the running result is not ok 30529 1726882652.39180: done checking to see if all hosts have failed 30529 1726882652.39180: getting the remaining hosts for this loop 30529 1726882652.39182: done getting the remaining hosts for this loop 30529 1726882652.39186: getting the next task for host managed_node1 30529 1726882652.39199: done getting next task for host managed_node1 30529 1726882652.39202: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882652.39209: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882652.39214: getting variables 30529 1726882652.39216: in VariableManager get_vars() 30529 1726882652.39266: Calling all_inventory to load vars for managed_node1 30529 1726882652.39269: Calling groups_inventory to load vars for managed_node1 30529 1726882652.39273: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.39292: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.39402: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.39408: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000166b 30529 1726882652.39411: WORKER PROCESS EXITING 30529 1726882652.39416: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.41871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.43483: done with get_vars() 30529 1726882652.43512: done getting variables 30529 1726882652.43573: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.43699: variable 'profile' from source: play vars 30529 1726882652.43703: variable 'interface' from source: play vars 30529 1726882652.43762: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:37:32 -0400 (0:00:00.076) 0:01:06.464 ****** 30529 1726882652.43804: entering _queue_task() for managed_node1/set_fact 30529 1726882652.44177: worker is 1 (out of 1 available) 30529 1726882652.44196: exiting _queue_task() for managed_node1/set_fact 30529 
1726882652.44210: done queuing things up, now waiting for results queue to drain 30529 1726882652.44212: waiting for pending results... 30529 1726882652.44515: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882652.45004: in run() - task 12673a56-9f93-b0f1-edc0-00000000166c 30529 1726882652.45007: variable 'ansible_search_path' from source: unknown 30529 1726882652.45010: variable 'ansible_search_path' from source: unknown 30529 1726882652.45013: calling self._execute() 30529 1726882652.45172: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.45230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.45245: variable 'omit' from source: magic vars 30529 1726882652.46113: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.46133: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.46409: variable 'profile_stat' from source: set_fact 30529 1726882652.46430: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882652.46439: when evaluation is False, skipping this task 30529 1726882652.46485: _execute() done 30529 1726882652.46499: dumping result to json 30529 1726882652.46508: done dumping result, returning 30529 1726882652.46522: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000166c] 30529 1726882652.46536: sending task result for task 12673a56-9f93-b0f1-edc0-00000000166c skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882652.46850: no more pending results, returning what we have 30529 1726882652.46854: results queue empty 30529 1726882652.46856: checking for any_errors_fatal 30529 1726882652.46862: done checking for any_errors_fatal 30529 1726882652.46863: 
checking for max_fail_percentage 30529 1726882652.46864: done checking for max_fail_percentage 30529 1726882652.46865: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.46866: done checking to see if all hosts have failed 30529 1726882652.46867: getting the remaining hosts for this loop 30529 1726882652.46868: done getting the remaining hosts for this loop 30529 1726882652.46872: getting the next task for host managed_node1 30529 1726882652.46880: done getting next task for host managed_node1 30529 1726882652.46882: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882652.46888: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882652.46896: getting variables 30529 1726882652.46898: in VariableManager get_vars() 30529 1726882652.46934: Calling all_inventory to load vars for managed_node1 30529 1726882652.46937: Calling groups_inventory to load vars for managed_node1 30529 1726882652.46940: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.46953: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.46956: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.46958: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.47906: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000166c 30529 1726882652.47910: WORKER PROCESS EXITING 30529 1726882652.49467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.51442: done with get_vars() 30529 1726882652.51464: done getting variables 30529 1726882652.51527: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.51632: variable 'profile' from source: play vars 30529 1726882652.51636: variable 'interface' from source: play vars 30529 1726882652.51686: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:37:32 -0400 (0:00:00.079) 0:01:06.543 ****** 30529 1726882652.51725: entering _queue_task() for managed_node1/command 30529 1726882652.52068: worker is 1 (out of 1 available) 30529 1726882652.52083: exiting _queue_task() for managed_node1/command 30529 
1726882652.52369: done queuing things up, now waiting for results queue to drain 30529 1726882652.52371: waiting for pending results... 30529 1726882652.52511: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882652.52651: in run() - task 12673a56-9f93-b0f1-edc0-00000000166d 30529 1726882652.52665: variable 'ansible_search_path' from source: unknown 30529 1726882652.52677: variable 'ansible_search_path' from source: unknown 30529 1726882652.52716: calling self._execute() 30529 1726882652.52855: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.52858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.52877: variable 'omit' from source: magic vars 30529 1726882652.53305: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.53325: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.53475: variable 'profile_stat' from source: set_fact 30529 1726882652.53484: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882652.53487: when evaluation is False, skipping this task 30529 1726882652.53494: _execute() done 30529 1726882652.53497: dumping result to json 30529 1726882652.53500: done dumping result, returning 30529 1726882652.53503: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000166d] 30529 1726882652.53509: sending task result for task 12673a56-9f93-b0f1-edc0-00000000166d 30529 1726882652.53614: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000166d 30529 1726882652.53617: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882652.53695: no more pending results, returning what we have 30529 1726882652.53700: results queue empty 30529 
1726882652.53701: checking for any_errors_fatal 30529 1726882652.53711: done checking for any_errors_fatal 30529 1726882652.53712: checking for max_fail_percentage 30529 1726882652.53714: done checking for max_fail_percentage 30529 1726882652.53715: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.53716: done checking to see if all hosts have failed 30529 1726882652.53717: getting the remaining hosts for this loop 30529 1726882652.53719: done getting the remaining hosts for this loop 30529 1726882652.53723: getting the next task for host managed_node1 30529 1726882652.53732: done getting next task for host managed_node1 30529 1726882652.53735: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882652.53740: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882652.53744: getting variables 30529 1726882652.53746: in VariableManager get_vars() 30529 1726882652.53787: Calling all_inventory to load vars for managed_node1 30529 1726882652.53790: Calling groups_inventory to load vars for managed_node1 30529 1726882652.53900: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.53921: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.53926: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.53930: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.55760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.58108: done with get_vars() 30529 1726882652.58135: done getting variables 30529 1726882652.58215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.58333: variable 'profile' from source: play vars 30529 1726882652.58337: variable 'interface' from source: play vars 30529 1726882652.58410: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:37:32 -0400 (0:00:00.067) 0:01:06.610 ****** 30529 1726882652.58444: entering _queue_task() for managed_node1/set_fact 30529 1726882652.59025: worker is 1 (out of 1 available) 30529 1726882652.59036: exiting _queue_task() for managed_node1/set_fact 30529 1726882652.59062: done queuing things up, now waiting for results queue to drain 30529 1726882652.59068: waiting for pending results... 
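[Annotation] The run of skips in this stretch of the trace all follow one pattern: each ifcfg-related task (get/verify the `ansible_managed` and fingerprint comments) is gated on `profile_stat.stat.exists`, and the earlier stat of `ifcfg-statebr` reported `exists: false`, so every one of them reports `skip_reason: "Conditional result was False"`. A simplified, assumed sketch of that skip decision (not Ansible's actual internals):

```python
# Simplified model of the when-clause handling seen repeatedly in the trace:
# a False condition short-circuits the task and records the skip reason.
profile_stat = {"stat": {"exists": False}}  # ifcfg-statebr is absent

def task_outcome(when_result: bool) -> dict:
    """Return a skip-style result dict when the condition is False."""
    if not when_result:
        return {"changed": False,
                "skip_reason": "Conditional result was False"}
    return {"changed": True}

result = task_outcome(profile_stat["stat"]["exists"])
print(result["skip_reason"])  # -> Conditional result was False
```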
30529 1726882652.59513: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882652.59519: in run() - task 12673a56-9f93-b0f1-edc0-00000000166e 30529 1726882652.59522: variable 'ansible_search_path' from source: unknown 30529 1726882652.59525: variable 'ansible_search_path' from source: unknown 30529 1726882652.59527: calling self._execute() 30529 1726882652.59904: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.59907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.59910: variable 'omit' from source: magic vars 30529 1726882652.60107: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.60134: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.60375: variable 'profile_stat' from source: set_fact 30529 1726882652.60384: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882652.60387: when evaluation is False, skipping this task 30529 1726882652.60389: _execute() done 30529 1726882652.60400: dumping result to json 30529 1726882652.60402: done dumping result, returning 30529 1726882652.60409: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000166e] 30529 1726882652.60414: sending task result for task 12673a56-9f93-b0f1-edc0-00000000166e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882652.60555: no more pending results, returning what we have 30529 1726882652.60565: results queue empty 30529 1726882652.60567: checking for any_errors_fatal 30529 1726882652.60575: done checking for any_errors_fatal 30529 1726882652.60576: checking for max_fail_percentage 30529 1726882652.60586: done checking for max_fail_percentage 30529 1726882652.60587: checking to see if all hosts 
have failed and the running result is not ok 30529 1726882652.60588: done checking to see if all hosts have failed 30529 1726882652.60591: getting the remaining hosts for this loop 30529 1726882652.60595: done getting the remaining hosts for this loop 30529 1726882652.60600: getting the next task for host managed_node1 30529 1726882652.60611: done getting next task for host managed_node1 30529 1726882652.60614: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30529 1726882652.60619: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882652.60624: getting variables 30529 1726882652.60626: in VariableManager get_vars() 30529 1726882652.60664: Calling all_inventory to load vars for managed_node1 30529 1726882652.60667: Calling groups_inventory to load vars for managed_node1 30529 1726882652.60673: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.60911: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.60917: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.60921: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.61552: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000166e 30529 1726882652.62066: WORKER PROCESS EXITING 30529 1726882652.62905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.64956: done with get_vars() 30529 1726882652.64983: done getting variables 30529 1726882652.65198: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.65377: variable 'profile' from source: play vars 30529 1726882652.65381: variable 'interface' from source: play vars 30529 1726882652.65440: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:37:32 -0400 (0:00:00.070) 0:01:06.681 ****** 30529 1726882652.65538: entering _queue_task() for managed_node1/assert 30529 1726882652.65957: worker is 1 (out of 1 available) 30529 1726882652.65969: exiting _queue_task() for managed_node1/assert 30529 
1726882652.65983: done queuing things up, now waiting for results queue to drain 30529 1726882652.65985: waiting for pending results... 30529 1726882652.66318: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' 30529 1726882652.66438: in run() - task 12673a56-9f93-b0f1-edc0-0000000015d5 30529 1726882652.66458: variable 'ansible_search_path' from source: unknown 30529 1726882652.66464: variable 'ansible_search_path' from source: unknown 30529 1726882652.66506: calling self._execute() 30529 1726882652.66610: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.66625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.66645: variable 'omit' from source: magic vars 30529 1726882652.67055: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.67059: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.67066: variable 'omit' from source: magic vars 30529 1726882652.67117: variable 'omit' from source: magic vars 30529 1726882652.67228: variable 'profile' from source: play vars 30529 1726882652.67273: variable 'interface' from source: play vars 30529 1726882652.67317: variable 'interface' from source: play vars 30529 1726882652.67341: variable 'omit' from source: magic vars 30529 1726882652.67401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882652.67445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882652.67473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882652.67598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882652.67603: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882652.67605: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882652.67607: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.67609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.67698: Set connection var ansible_shell_executable to /bin/sh 30529 1726882652.67701: Set connection var ansible_pipelining to False 30529 1726882652.67703: Set connection var ansible_shell_type to sh 30529 1726882652.67715: Set connection var ansible_timeout to 10 30529 1726882652.67798: Set connection var ansible_connection to ssh 30529 1726882652.67801: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882652.67803: variable 'ansible_shell_executable' from source: unknown 30529 1726882652.67805: variable 'ansible_connection' from source: unknown 30529 1726882652.67807: variable 'ansible_module_compression' from source: unknown 30529 1726882652.67809: variable 'ansible_shell_type' from source: unknown 30529 1726882652.67810: variable 'ansible_shell_executable' from source: unknown 30529 1726882652.67812: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.67814: variable 'ansible_pipelining' from source: unknown 30529 1726882652.67816: variable 'ansible_timeout' from source: unknown 30529 1726882652.67818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.67965: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882652.67982: variable 'omit' from source: magic vars 30529 1726882652.67997: starting 
attempt loop 30529 1726882652.68004: running the handler 30529 1726882652.68134: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882652.68145: Evaluated conditional (not lsr_net_profile_exists): True 30529 1726882652.68154: handler run complete 30529 1726882652.68177: attempt loop complete, returning result 30529 1726882652.68184: _execute() done 30529 1726882652.68192: dumping result to json 30529 1726882652.68285: done dumping result, returning 30529 1726882652.68288: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' [12673a56-9f93-b0f1-edc0-0000000015d5] 30529 1726882652.68295: sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d5 30529 1726882652.68364: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000015d5 30529 1726882652.68366: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882652.68440: no more pending results, returning what we have 30529 1726882652.68444: results queue empty 30529 1726882652.68445: checking for any_errors_fatal 30529 1726882652.68453: done checking for any_errors_fatal 30529 1726882652.68454: checking for max_fail_percentage 30529 1726882652.68456: done checking for max_fail_percentage 30529 1726882652.68457: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.68458: done checking to see if all hosts have failed 30529 1726882652.68458: getting the remaining hosts for this loop 30529 1726882652.68460: done getting the remaining hosts for this loop 30529 1726882652.68464: getting the next task for host managed_node1 30529 1726882652.68474: done getting next task for host managed_node1 30529 1726882652.68477: ^ task is: TASK: Conditional asserts 30529 1726882652.68480: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882652.68486: getting variables 30529 1726882652.68488: in VariableManager get_vars() 30529 1726882652.68532: Calling all_inventory to load vars for managed_node1 30529 1726882652.68535: Calling groups_inventory to load vars for managed_node1 30529 1726882652.68539: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.68551: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.68555: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.68558: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.70240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.71827: done with get_vars() 30529 1726882652.71856: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:37:32 -0400 (0:00:00.064) 0:01:06.745 ****** 30529 1726882652.71971: entering _queue_task() for managed_node1/include_tasks 30529 1726882652.72429: worker is 1 (out of 1 available) 30529 1726882652.72440: exiting _queue_task() for managed_node1/include_tasks 30529 1726882652.72452: done queuing things up, now waiting for results queue to drain 30529 1726882652.72453: waiting for pending results... 
30529 1726882652.72698: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882652.72778: in run() - task 12673a56-9f93-b0f1-edc0-00000000100b 30529 1726882652.72811: variable 'ansible_search_path' from source: unknown 30529 1726882652.72822: variable 'ansible_search_path' from source: unknown 30529 1726882652.73117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882652.75776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882652.75953: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882652.75958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882652.75961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882652.75986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882652.76082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882652.76117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882652.76143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882652.76183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882652.76210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882652.76375: dumping result to json 30529 1726882652.76383: done dumping result, returning 30529 1726882652.76406: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-00000000100b] 30529 1726882652.76421: sending task result for task 12673a56-9f93-b0f1-edc0-00000000100b 30529 1726882652.76598: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000100b 30529 1726882652.76601: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } 30529 1726882652.76671: no more pending results, returning what we have 30529 1726882652.76675: results queue empty 30529 1726882652.76676: checking for any_errors_fatal 30529 1726882652.76681: done checking for any_errors_fatal 30529 1726882652.76682: checking for max_fail_percentage 30529 1726882652.76684: done checking for max_fail_percentage 30529 1726882652.76685: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.76687: done checking to see if all hosts have failed 30529 1726882652.76687: getting the remaining hosts for this loop 30529 1726882652.76692: done getting the remaining hosts for this loop 30529 1726882652.76801: getting the next task for host managed_node1 30529 1726882652.76808: done getting next task for host managed_node1 30529 1726882652.76811: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882652.76814: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882652.76820: getting variables 30529 1726882652.76822: in VariableManager get_vars() 30529 1726882652.76859: Calling all_inventory to load vars for managed_node1 30529 1726882652.76861: Calling groups_inventory to load vars for managed_node1 30529 1726882652.76865: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.76875: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.76879: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.76882: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.78664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.80297: done with get_vars() 30529 1726882652.80324: done getting variables 30529 1726882652.80392: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882652.80526: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:37:32 -0400 (0:00:00.086) 0:01:06.831 ****** 30529 1726882652.80567: entering _queue_task() for managed_node1/debug 30529 1726882652.80960: worker is 1 
(out of 1 available) 30529 1726882652.80974: exiting _queue_task() for managed_node1/debug 30529 1726882652.81102: done queuing things up, now waiting for results queue to drain 30529 1726882652.81104: waiting for pending results... 30529 1726882652.81414: running TaskExecutor() for managed_node1/TASK: Success in test 'I can remove an existing profile without taking it down' 30529 1726882652.81427: in run() - task 12673a56-9f93-b0f1-edc0-00000000100c 30529 1726882652.81451: variable 'ansible_search_path' from source: unknown 30529 1726882652.81461: variable 'ansible_search_path' from source: unknown 30529 1726882652.81510: calling self._execute() 30529 1726882652.81609: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.81630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.81647: variable 'omit' from source: magic vars 30529 1726882652.82313: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.82332: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.82345: variable 'omit' from source: magic vars 30529 1726882652.82402: variable 'omit' from source: magic vars 30529 1726882652.82512: variable 'lsr_description' from source: include params 30529 1726882652.82603: variable 'omit' from source: magic vars 30529 1726882652.82607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882652.82634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882652.82660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882652.82682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882652.82712: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882652.82748: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882652.82757: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.82766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.82884: Set connection var ansible_shell_executable to /bin/sh 30529 1726882652.82901: Set connection var ansible_pipelining to False 30529 1726882652.82909: Set connection var ansible_shell_type to sh 30529 1726882652.83037: Set connection var ansible_timeout to 10 30529 1726882652.83040: Set connection var ansible_connection to ssh 30529 1726882652.83042: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882652.83044: variable 'ansible_shell_executable' from source: unknown 30529 1726882652.83046: variable 'ansible_connection' from source: unknown 30529 1726882652.83048: variable 'ansible_module_compression' from source: unknown 30529 1726882652.83050: variable 'ansible_shell_type' from source: unknown 30529 1726882652.83051: variable 'ansible_shell_executable' from source: unknown 30529 1726882652.83053: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.83055: variable 'ansible_pipelining' from source: unknown 30529 1726882652.83057: variable 'ansible_timeout' from source: unknown 30529 1726882652.83058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.83154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882652.83169: variable 'omit' from source: magic vars 30529 1726882652.83176: starting attempt 
loop 30529 1726882652.83182: running the handler 30529 1726882652.83234: handler run complete 30529 1726882652.83258: attempt loop complete, returning result 30529 1726882652.83265: _execute() done 30529 1726882652.83271: dumping result to json 30529 1726882652.83278: done dumping result, returning 30529 1726882652.83296: done running TaskExecutor() for managed_node1/TASK: Success in test 'I can remove an existing profile without taking it down' [12673a56-9f93-b0f1-edc0-00000000100c] 30529 1726882652.83307: sending task result for task 12673a56-9f93-b0f1-edc0-00000000100c ok: [managed_node1] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 30529 1726882652.83540: no more pending results, returning what we have 30529 1726882652.83544: results queue empty 30529 1726882652.83545: checking for any_errors_fatal 30529 1726882652.83552: done checking for any_errors_fatal 30529 1726882652.83553: checking for max_fail_percentage 30529 1726882652.83554: done checking for max_fail_percentage 30529 1726882652.83555: checking to see if all hosts have failed and the running result is not ok 30529 1726882652.83556: done checking to see if all hosts have failed 30529 1726882652.83557: getting the remaining hosts for this loop 30529 1726882652.83559: done getting the remaining hosts for this loop 30529 1726882652.83562: getting the next task for host managed_node1 30529 1726882652.83570: done getting next task for host managed_node1 30529 1726882652.83686: ^ task is: TASK: Cleanup 30529 1726882652.83691: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882652.83702: getting variables 30529 1726882652.83703: in VariableManager get_vars() 30529 1726882652.83738: Calling all_inventory to load vars for managed_node1 30529 1726882652.83740: Calling groups_inventory to load vars for managed_node1 30529 1726882652.83744: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882652.83756: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.83759: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.83762: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.84319: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000100c 30529 1726882652.84323: WORKER PROCESS EXITING 30529 1726882652.85385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.88933: done with get_vars() 30529 1726882652.88958: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:37:32 -0400 (0:00:00.084) 0:01:06.916 ****** 30529 1726882652.89068: entering _queue_task() for managed_node1/include_tasks 30529 1726882652.89437: worker is 1 (out of 1 available) 30529 1726882652.89450: exiting _queue_task() for managed_node1/include_tasks 30529 1726882652.89577: done queuing things up, now waiting for results queue to drain 30529 1726882652.89579: waiting for pending results... 
30529 1726882652.89775: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882652.89905: in run() - task 12673a56-9f93-b0f1-edc0-000000001010 30529 1726882652.89929: variable 'ansible_search_path' from source: unknown 30529 1726882652.89936: variable 'ansible_search_path' from source: unknown 30529 1726882652.89983: variable 'lsr_cleanup' from source: include params 30529 1726882652.90402: variable 'lsr_cleanup' from source: include params 30529 1726882652.90741: variable 'omit' from source: magic vars 30529 1726882652.90961: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882652.90965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882652.90969: variable 'omit' from source: magic vars 30529 1726882652.91540: variable 'ansible_distribution_major_version' from source: facts 30529 1726882652.91555: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882652.91567: variable 'item' from source: unknown 30529 1726882652.91665: variable 'item' from source: unknown 30529 1726882652.91765: variable 'item' from source: unknown 30529 1726882652.91922: variable 'item' from source: unknown 30529 1726882652.92609: dumping result to json 30529 1726882652.92613: done dumping result, returning 30529 1726882652.92615: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-000000001010] 30529 1726882652.92617: sending task result for task 12673a56-9f93-b0f1-edc0-000000001010 30529 1726882652.92661: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001010 30529 1726882652.92664: WORKER PROCESS EXITING 30529 1726882652.92734: no more pending results, returning what we have 30529 1726882652.92739: in VariableManager get_vars() 30529 1726882652.92787: Calling all_inventory to load vars for managed_node1 30529 1726882652.92795: Calling groups_inventory to load vars for managed_node1 30529 1726882652.92799: Calling 
all_plugins_inventory to load vars for managed_node1 30529 1726882652.92813: Calling all_plugins_play to load vars for managed_node1 30529 1726882652.92817: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882652.92820: Calling groups_plugins_play to load vars for managed_node1 30529 1726882652.95605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882652.98976: done with get_vars() 30529 1726882652.99014: variable 'ansible_search_path' from source: unknown 30529 1726882652.99016: variable 'ansible_search_path' from source: unknown 30529 1726882652.99058: we have included files to process 30529 1726882652.99060: generating all_blocks data 30529 1726882652.99062: done generating all_blocks data 30529 1726882652.99070: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882652.99071: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882652.99073: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882652.99506: done processing included file 30529 1726882652.99508: iterating over new_blocks loaded from include file 30529 1726882652.99509: in VariableManager get_vars() 30529 1726882652.99527: done with get_vars() 30529 1726882652.99529: filtering new block on tags 30529 1726882652.99555: done filtering new block on tags 30529 1726882652.99558: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882652.99563: extending task lists for all hosts with included blocks 
30529 1726882653.00846: done extending task lists 30529 1726882653.00848: done processing included files 30529 1726882653.00848: results queue empty 30529 1726882653.00849: checking for any_errors_fatal 30529 1726882653.00852: done checking for any_errors_fatal 30529 1726882653.00853: checking for max_fail_percentage 30529 1726882653.00854: done checking for max_fail_percentage 30529 1726882653.00855: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.00856: done checking to see if all hosts have failed 30529 1726882653.00857: getting the remaining hosts for this loop 30529 1726882653.00858: done getting the remaining hosts for this loop 30529 1726882653.00860: getting the next task for host managed_node1 30529 1726882653.00864: done getting next task for host managed_node1 30529 1726882653.00866: ^ task is: TASK: Cleanup profile and device 30529 1726882653.00869: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882653.00872: getting variables 30529 1726882653.00873: in VariableManager get_vars() 30529 1726882653.00884: Calling all_inventory to load vars for managed_node1 30529 1726882653.00887: Calling groups_inventory to load vars for managed_node1 30529 1726882653.00891: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.00898: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.00900: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.00903: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.07682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.09233: done with get_vars() 30529 1726882653.09262: done getting variables 30529 1726882653.09315: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:37:33 -0400 (0:00:00.202) 0:01:07.119 ****** 30529 1726882653.09345: entering _queue_task() for managed_node1/shell 30529 1726882653.09728: worker is 1 (out of 1 available) 30529 1726882653.09739: exiting _queue_task() for managed_node1/shell 30529 1726882653.09753: done queuing things up, now waiting for results queue to drain 30529 1726882653.09755: waiting for pending results... 
30529 1726882653.10125: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882653.10185: in run() - task 12673a56-9f93-b0f1-edc0-0000000016ad 30529 1726882653.10219: variable 'ansible_search_path' from source: unknown 30529 1726882653.10232: variable 'ansible_search_path' from source: unknown 30529 1726882653.10328: calling self._execute() 30529 1726882653.10398: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.10414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.10436: variable 'omit' from source: magic vars 30529 1726882653.10840: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.10859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.10877: variable 'omit' from source: magic vars 30529 1726882653.10979: variable 'omit' from source: magic vars 30529 1726882653.11101: variable 'interface' from source: play vars 30529 1726882653.11129: variable 'omit' from source: magic vars 30529 1726882653.11175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882653.11227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.11254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882653.11300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.11304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.11337: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.11346: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.11398: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.11473: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.11485: Set connection var ansible_pipelining to False 30529 1726882653.11498: Set connection var ansible_shell_type to sh 30529 1726882653.11514: Set connection var ansible_timeout to 10 30529 1726882653.11521: Set connection var ansible_connection to ssh 30529 1726882653.11535: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.11560: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.11569: variable 'ansible_connection' from source: unknown 30529 1726882653.11597: variable 'ansible_module_compression' from source: unknown 30529 1726882653.11601: variable 'ansible_shell_type' from source: unknown 30529 1726882653.11603: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.11605: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.11607: variable 'ansible_pipelining' from source: unknown 30529 1726882653.11609: variable 'ansible_timeout' from source: unknown 30529 1726882653.11612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.11766: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.11974: variable 'omit' from source: magic vars 30529 1726882653.11977: starting attempt loop 30529 1726882653.11980: running the handler 30529 1726882653.11983: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.11986: _low_level_execute_command(): starting 30529 1726882653.11988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882653.13213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882653.13230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882653.13313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882653.13356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882653.13375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.13402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.13527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.15214: stdout chunk (state=3): >>>/root <<< 30529 1726882653.15502: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30529 1726882653.15505: stdout chunk (state=3): >>><<< 30529 1726882653.15507: stderr chunk (state=3): >>><<< 30529 1726882653.15513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882653.15516: _low_level_execute_command(): starting 30529 1726882653.15518: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669 `" && echo ansible-tmp-1726882653.1538582-33724-79826702561669="` echo /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669 `" ) && sleep 0' 30529 1726882653.16009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882653.16020: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882653.16076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882653.16127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882653.16139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.16156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.16225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.18110: stdout chunk (state=3): >>>ansible-tmp-1726882653.1538582-33724-79826702561669=/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669 <<< 30529 1726882653.18277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882653.18281: stdout chunk (state=3): >>><<< 30529 1726882653.18285: stderr chunk (state=3): >>><<< 30529 1726882653.18314: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882653.1538582-33724-79826702561669=/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882653.18498: variable 'ansible_module_compression' from source: unknown 30529 1726882653.18502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882653.18504: variable 'ansible_facts' from source: unknown 30529 1726882653.18554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py 30529 1726882653.18714: Sending initial data 30529 1726882653.18733: Sent initial data (155 bytes) 30529 1726882653.19366: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882653.19399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882653.19512: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.19533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.19622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.21187: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882653.21332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882653.21383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpkvnk62jt /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py <<< 30529 1726882653.21387: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py" <<< 30529 1726882653.21698: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpkvnk62jt" to remote "/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py" <<< 30529 1726882653.22246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882653.22324: stderr chunk (state=3): >>><<< 30529 1726882653.22337: stdout chunk (state=3): >>><<< 30529 1726882653.22408: done transferring module to remote 30529 1726882653.22423: _low_level_execute_command(): starting 30529 1726882653.22432: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/ /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py && sleep 0' 30529 1726882653.23070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882653.23074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882653.23138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882653.23142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882653.23408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.23473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.23537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.25332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882653.25336: stderr chunk (state=3): >>><<< 30529 1726882653.25339: stdout chunk (state=3): >>><<< 30529 1726882653.25360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882653.25363: _low_level_execute_command(): starting 30529 1726882653.25367: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/AnsiballZ_command.py && sleep 0' 30529 1726882653.26428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882653.26432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882653.26434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882653.26436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882653.26438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882653.26440: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882653.26442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882653.26458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882653.26537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.26607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.26681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.46829: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (925d78f3-a59a-474c-aff9-927d62a7a239) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:37:33.416378", "end": "2024-09-20 21:37:33.466944", "delta": "0:00:00.050566", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882653.48922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882653.48949: stderr chunk (state=3): >>><<< 30529 1726882653.48952: stdout chunk (state=3): >>><<< 30529 1726882653.48969: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (925d78f3-a59a-474c-aff9-927d62a7a239) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:37:33.416378", "end": "2024-09-20 21:37:33.466944", "delta": "0:00:00.050566", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882653.49007: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882653.49014: _low_level_execute_command(): starting 30529 1726882653.49019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882653.1538582-33724-79826702561669/ > /dev/null 2>&1 && sleep 0' 30529 1726882653.49453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882653.49456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882653.49487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found <<< 30529 1726882653.49492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882653.49498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882653.49501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882653.49552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882653.49559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882653.49562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882653.49604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882653.51384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882653.51411: stderr chunk (state=3): >>><<< 30529 1726882653.51415: stdout chunk (state=3): >>><<< 30529 1726882653.51427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882653.51433: handler run complete 30529 1726882653.51451: Evaluated conditional (False): False 30529 1726882653.51461: attempt loop complete, returning result 30529 1726882653.51464: _execute() done 30529 1726882653.51467: dumping result to json 30529 1726882653.51473: done dumping result, returning 30529 1726882653.51480: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-0000000016ad] 30529 1726882653.51484: sending task result for task 12673a56-9f93-b0f1-edc0-0000000016ad 30529 1726882653.51580: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000016ad 30529 1726882653.51583: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.050566", "end": "2024-09-20 21:37:33.466944", "rc": 0, "start": "2024-09-20 21:37:33.416378" } STDOUT: Connection 'statebr' (925d78f3-a59a-474c-aff9-927d62a7a239) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30529 1726882653.51644: no more pending results, returning what we have 30529 1726882653.51647: results queue empty 30529 1726882653.51648: checking for any_errors_fatal 30529 1726882653.51649: done checking for any_errors_fatal 30529 1726882653.51650: checking for max_fail_percentage 30529 1726882653.51652: done checking for max_fail_percentage 30529 1726882653.51652: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.51653: done checking to see if all hosts have failed 30529 1726882653.51654: getting the remaining hosts for this loop 30529 1726882653.51655: done getting the remaining hosts for this loop 30529 1726882653.51659: getting the next task for host managed_node1 30529 1726882653.51670: done getting next task for host managed_node1 30529 1726882653.51672: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882653.51674: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882653.51678: getting variables 30529 1726882653.51680: in VariableManager get_vars() 30529 1726882653.51721: Calling all_inventory to load vars for managed_node1 30529 1726882653.51723: Calling groups_inventory to load vars for managed_node1 30529 1726882653.51727: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.51738: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.51741: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.51743: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.52564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.53433: done with get_vars() 30529 1726882653.53449: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 21:37:33 -0400 (0:00:00.441) 0:01:07.561 ****** 30529 1726882653.53519: entering _queue_task() for managed_node1/include_tasks 30529 1726882653.53746: worker is 1 (out of 1 available) 30529 1726882653.53759: exiting _queue_task() for managed_node1/include_tasks 30529 1726882653.53771: done queuing things up, now waiting for results queue to drain 30529 1726882653.53774: waiting for pending results... 
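The `_low_level_execute_command()` lifecycle traced in the log above follows a fixed sequence: create a unique remote tmp directory under a restrictive umask, sftp the `AnsiballZ_command.py` payload into it, `chmod u+x` it, run it with the remote Python, and finally `rm -f -r` the directory. The tmp-dir creation command can be reconstructed as a sketch (function name and suffix scheme are illustrative, not Ansible's internal API; the shell string mirrors the one logged verbatim):

```python
import os
import random
import time


def build_remote_tmp_command(remote_tmp: str = "/root/.ansible/tmp") -> str:
    """Rebuild the logged tmp-dir shell command (a sketch, not Ansible's
    actual implementation): under umask 77, ensure the base tmp dir exists,
    create a unique per-task subdirectory, and echo name=path so the
    controller can parse the directory it will upload the module into."""
    # Unique suffix: timestamp, controller pid, random number -- the same
    # ingredients visible in "ansible-tmp-1726882653.1538582-33724-79826702561669".
    basefile = "ansible-tmp-%s-%s-%s" % (
        time.time(), os.getpid(), random.randint(0, 2**48))
    tmpdir = "%s/%s" % (remote_tmp, basefile)
    return (
        '( umask 77 && mkdir -p "` echo %s `" && mkdir "` echo %s `" '
        '&& echo %s="` echo %s `" ) && sleep 0'
        % (remote_tmp, tmpdir, basefile, tmpdir))
```

The trailing `&& sleep 0` matches the logged command; it forces the shell to flush output before the mux channel closes, so the controller reliably reads the echoed `name=path` line on stdout.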
30529 1726882653.53953: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' 30529 1726882653.54022: in run() - task 12673a56-9f93-b0f1-edc0-000000000015 30529 1726882653.54034: variable 'ansible_search_path' from source: unknown 30529 1726882653.54062: calling self._execute() 30529 1726882653.54140: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.54144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.54153: variable 'omit' from source: magic vars 30529 1726882653.54422: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.54437: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.54440: _execute() done 30529 1726882653.54444: dumping result to json 30529 1726882653.54446: done dumping result, returning 30529 1726882653.54449: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-000000000015] 30529 1726882653.54455: sending task result for task 12673a56-9f93-b0f1-edc0-000000000015 30529 1726882653.54599: no more pending results, returning what we have 30529 1726882653.54604: in VariableManager get_vars() 30529 1726882653.54641: Calling all_inventory to load vars for managed_node1 30529 1726882653.54643: Calling groups_inventory to load vars for managed_node1 30529 1726882653.54647: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.54657: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.54660: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.54662: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.55355: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000015 30529 1726882653.55358: WORKER PROCESS EXITING 30529 1726882653.55605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30529 1726882653.56451: done with get_vars() 30529 1726882653.56466: variable 'ansible_search_path' from source: unknown 30529 1726882653.56476: we have included files to process 30529 1726882653.56477: generating all_blocks data 30529 1726882653.56478: done generating all_blocks data 30529 1726882653.56482: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882653.56483: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882653.56484: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882653.56742: in VariableManager get_vars() 30529 1726882653.56755: done with get_vars() 30529 1726882653.56779: in VariableManager get_vars() 30529 1726882653.56794: done with get_vars() 30529 1726882653.56822: in VariableManager get_vars() 30529 1726882653.56833: done with get_vars() 30529 1726882653.56858: in VariableManager get_vars() 30529 1726882653.56867: done with get_vars() 30529 1726882653.56894: in VariableManager get_vars() 30529 1726882653.56908: done with get_vars() 30529 1726882653.57158: in VariableManager get_vars() 30529 1726882653.57169: done with get_vars() 30529 1726882653.57177: done processing included file 30529 1726882653.57178: iterating over new_blocks loaded from include file 30529 1726882653.57179: in VariableManager get_vars() 30529 1726882653.57185: done with get_vars() 30529 1726882653.57186: filtering new block on tags 30529 1726882653.57252: done filtering new block on tags 30529 1726882653.57254: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1 30529 1726882653.57257: extending task lists for all hosts with included 
blocks 30529 1726882653.57278: done extending task lists 30529 1726882653.57278: done processing included files 30529 1726882653.57279: results queue empty 30529 1726882653.57279: checking for any_errors_fatal 30529 1726882653.57284: done checking for any_errors_fatal 30529 1726882653.57284: checking for max_fail_percentage 30529 1726882653.57285: done checking for max_fail_percentage 30529 1726882653.57286: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.57286: done checking to see if all hosts have failed 30529 1726882653.57287: getting the remaining hosts for this loop 30529 1726882653.57288: done getting the remaining hosts for this loop 30529 1726882653.57291: getting the next task for host managed_node1 30529 1726882653.57295: done getting next task for host managed_node1 30529 1726882653.57296: ^ task is: TASK: TEST: {{ lsr_description }} 30529 1726882653.57297: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882653.57299: getting variables 30529 1726882653.57299: in VariableManager get_vars() 30529 1726882653.57306: Calling all_inventory to load vars for managed_node1 30529 1726882653.57308: Calling groups_inventory to load vars for managed_node1 30529 1726882653.57309: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.57313: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.57315: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.57316: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.57951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.58895: done with get_vars() 30529 1726882653.58910: done getting variables 30529 1726882653.58935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882653.59020: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:37:33 -0400 (0:00:00.055) 0:01:07.616 ****** 30529 1726882653.59040: entering _queue_task() for managed_node1/debug 30529 1726882653.59304: worker is 1 (out of 1 available) 30529 1726882653.59317: exiting _queue_task() for managed_node1/debug 30529 1726882653.59329: done queuing things up, now waiting for results queue to drain 30529 1726882653.59331: waiting for pending results... 
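The header above shows the templated task name `TASK: TEST: {{ lsr_description }}` rendered to `TEST: I can take a profile down that is absent`, with `lsr_description` resolved "from source: include params". Ansible does this with Jinja2; a minimal stand-in for that substitution (hand-rolled here so it is self-contained, not the real Jinja2 engine) looks like:

```python
import re


def render_task_name(template: str, variables: dict) -> str:
    """Replace {{ var }} placeholders with values from the supplied vars,
    mimicking how the task name template is rendered from include params.
    Unknown variables are left as-is (a simplification of Jinja2 behavior)."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template)
```

With `{"lsr_description": "I can take a profile down that is absent"}`, rendering `"TEST: {{ lsr_description }}"` yields the task header seen in the log.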
30529 1726882653.59505: running TaskExecutor() for managed_node1/TASK: TEST: I can take a profile down that is absent 30529 1726882653.59575: in run() - task 12673a56-9f93-b0f1-edc0-000000001744 30529 1726882653.59586: variable 'ansible_search_path' from source: unknown 30529 1726882653.59594: variable 'ansible_search_path' from source: unknown 30529 1726882653.59619: calling self._execute() 30529 1726882653.59692: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.59697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.59704: variable 'omit' from source: magic vars 30529 1726882653.59964: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.59974: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.59980: variable 'omit' from source: magic vars 30529 1726882653.60010: variable 'omit' from source: magic vars 30529 1726882653.60079: variable 'lsr_description' from source: include params 30529 1726882653.60097: variable 'omit' from source: magic vars 30529 1726882653.60128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882653.60155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.60171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882653.60184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.60197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.60223: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.60226: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882653.60229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.60299: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.60304: Set connection var ansible_pipelining to False 30529 1726882653.60307: Set connection var ansible_shell_type to sh 30529 1726882653.60321: Set connection var ansible_timeout to 10 30529 1726882653.60325: Set connection var ansible_connection to ssh 30529 1726882653.60327: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.60343: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.60346: variable 'ansible_connection' from source: unknown 30529 1726882653.60349: variable 'ansible_module_compression' from source: unknown 30529 1726882653.60351: variable 'ansible_shell_type' from source: unknown 30529 1726882653.60354: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.60356: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.60358: variable 'ansible_pipelining' from source: unknown 30529 1726882653.60360: variable 'ansible_timeout' from source: unknown 30529 1726882653.60365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.60486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.60498: variable 'omit' from source: magic vars 30529 1726882653.60501: starting attempt loop 30529 1726882653.60504: running the handler 30529 1726882653.60564: handler run complete 30529 1726882653.60567: attempt loop complete, returning result 30529 1726882653.60570: _execute() done 30529 1726882653.60572: dumping result to json 30529 1726882653.60575: done dumping result, returning 
30529 1726882653.60577: done running TaskExecutor() for managed_node1/TASK: TEST: I can take a profile down that is absent [12673a56-9f93-b0f1-edc0-000000001744] 30529 1726882653.60579: sending task result for task 12673a56-9f93-b0f1-edc0-000000001744 30529 1726882653.60674: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001744 30529 1726882653.60677: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ########## I can take a profile down that is absent ########## 30529 1726882653.60930: no more pending results, returning what we have 30529 1726882653.60933: results queue empty 30529 1726882653.60934: checking for any_errors_fatal 30529 1726882653.60935: done checking for any_errors_fatal 30529 1726882653.60936: checking for max_fail_percentage 30529 1726882653.60937: done checking for max_fail_percentage 30529 1726882653.60938: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.60939: done checking to see if all hosts have failed 30529 1726882653.60940: getting the remaining hosts for this loop 30529 1726882653.60941: done getting the remaining hosts for this loop 30529 1726882653.60944: getting the next task for host managed_node1 30529 1726882653.60949: done getting next task for host managed_node1 30529 1726882653.60951: ^ task is: TASK: Show item 30529 1726882653.60954: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
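The banner above comes from the debug task at run_test.yml:5. The exact task body is not shown in this log, so the following is a hypothetical reconstruction based on the task name and the rendered MSG (the `##########` framing and the use of `lsr_description` are inferred, not confirmed):

```yaml
# Hedged sketch of the banner task at run_test.yml:5.
# Assumption: lsr_description is supplied as an include param
# (the log shows: variable 'lsr_description' from source: include params).
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "##########\n{{ lsr_description }}\n##########"
```

A task written this way produces exactly one `ok:` result per host with the interpolated description in MSG, matching the `ok: [managed_node1]` line in the log.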
False 30529 1726882653.60957: getting variables 30529 1726882653.60959: in VariableManager get_vars() 30529 1726882653.60988: Calling all_inventory to load vars for managed_node1 30529 1726882653.60995: Calling groups_inventory to load vars for managed_node1 30529 1726882653.60998: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.61013: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.61016: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.61019: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.62127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.62992: done with get_vars() 30529 1726882653.63008: done getting variables 30529 1726882653.63045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:37:33 -0400 (0:00:00.040) 0:01:07.656 ****** 30529 1726882653.63067: entering _queue_task() for managed_node1/debug 30529 1726882653.63281: worker is 1 (out of 1 available) 30529 1726882653.63296: exiting _queue_task() for managed_node1/debug 30529 1726882653.63309: done queuing things up, now waiting for results queue to drain 30529 1726882653.63310: waiting for pending results... 
30529 1726882653.63486: running TaskExecutor() for managed_node1/TASK: Show item 30529 1726882653.63555: in run() - task 12673a56-9f93-b0f1-edc0-000000001745 30529 1726882653.63567: variable 'ansible_search_path' from source: unknown 30529 1726882653.63573: variable 'ansible_search_path' from source: unknown 30529 1726882653.63802: variable 'omit' from source: magic vars 30529 1726882653.63806: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.63809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.63812: variable 'omit' from source: magic vars 30529 1726882653.64483: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.64498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.64507: variable 'omit' from source: magic vars 30529 1726882653.64541: variable 'omit' from source: magic vars 30529 1726882653.64584: variable 'item' from source: unknown 30529 1726882653.64651: variable 'item' from source: unknown 30529 1726882653.64667: variable 'omit' from source: magic vars 30529 1726882653.64706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882653.64744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.64785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882653.64788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.64793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.64822: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.64825: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882653.64828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.64999: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.65002: Set connection var ansible_pipelining to False 30529 1726882653.65004: Set connection var ansible_shell_type to sh 30529 1726882653.65006: Set connection var ansible_timeout to 10 30529 1726882653.65008: Set connection var ansible_connection to ssh 30529 1726882653.65010: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.65017: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.65019: variable 'ansible_connection' from source: unknown 30529 1726882653.65021: variable 'ansible_module_compression' from source: unknown 30529 1726882653.65023: variable 'ansible_shell_type' from source: unknown 30529 1726882653.65025: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.65027: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.65029: variable 'ansible_pipelining' from source: unknown 30529 1726882653.65031: variable 'ansible_timeout' from source: unknown 30529 1726882653.65098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.65183: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.65202: variable 'omit' from source: magic vars 30529 1726882653.65212: starting attempt loop 30529 1726882653.65219: running the handler 30529 1726882653.65277: variable 'lsr_description' from source: include params 30529 1726882653.65347: variable 'lsr_description' from source: include params 30529 1726882653.65369: handler run complete 30529 1726882653.65390: attempt loop 
complete, returning result 30529 1726882653.65453: variable 'item' from source: unknown 30529 1726882653.65485: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 30529 1726882653.65907: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.65910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.65913: variable 'omit' from source: magic vars 30529 1726882653.65915: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.65917: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.65919: variable 'omit' from source: magic vars 30529 1726882653.65931: variable 'omit' from source: magic vars 30529 1726882653.65973: variable 'item' from source: unknown 30529 1726882653.66047: variable 'item' from source: unknown 30529 1726882653.66065: variable 'omit' from source: magic vars 30529 1726882653.66086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.66123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.66127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.66130: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.66132: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.66142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.66216: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.66249: Set connection var ansible_pipelining to 
False 30529 1726882653.66252: Set connection var ansible_shell_type to sh 30529 1726882653.66254: Set connection var ansible_timeout to 10 30529 1726882653.66256: Set connection var ansible_connection to ssh 30529 1726882653.66263: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.66287: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.66340: variable 'ansible_connection' from source: unknown 30529 1726882653.66344: variable 'ansible_module_compression' from source: unknown 30529 1726882653.66346: variable 'ansible_shell_type' from source: unknown 30529 1726882653.66348: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.66349: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.66353: variable 'ansible_pipelining' from source: unknown 30529 1726882653.66355: variable 'ansible_timeout' from source: unknown 30529 1726882653.66357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.66449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.66452: variable 'omit' from source: magic vars 30529 1726882653.66454: starting attempt loop 30529 1726882653.66456: running the handler 30529 1726882653.66474: variable 'lsr_setup' from source: include params 30529 1726882653.66545: variable 'lsr_setup' from source: include params 30529 1726882653.66604: handler run complete 30529 1726882653.66688: attempt loop complete, returning result 30529 1726882653.66691: variable 'item' from source: unknown 30529 1726882653.66711: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 30529 1726882653.67102: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.67106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.67108: variable 'omit' from source: magic vars 30529 1726882653.67111: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.67113: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.67115: variable 'omit' from source: magic vars 30529 1726882653.67117: variable 'omit' from source: magic vars 30529 1726882653.67119: variable 'item' from source: unknown 30529 1726882653.67173: variable 'item' from source: unknown 30529 1726882653.67191: variable 'omit' from source: magic vars 30529 1726882653.67223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.67234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.67244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.67257: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.67266: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.67274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.67355: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.67365: Set connection var ansible_pipelining to False 30529 1726882653.67373: Set connection var ansible_shell_type to sh 30529 1726882653.67386: Set connection var ansible_timeout to 10 30529 1726882653.67395: Set connection var ansible_connection to ssh 30529 
1726882653.67405: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.67433: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.67440: variable 'ansible_connection' from source: unknown 30529 1726882653.67447: variable 'ansible_module_compression' from source: unknown 30529 1726882653.67453: variable 'ansible_shell_type' from source: unknown 30529 1726882653.67459: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.67466: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.67474: variable 'ansible_pipelining' from source: unknown 30529 1726882653.67481: variable 'ansible_timeout' from source: unknown 30529 1726882653.67488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.67570: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.67580: variable 'omit' from source: magic vars 30529 1726882653.67587: starting attempt loop 30529 1726882653.67592: running the handler 30529 1726882653.67613: variable 'lsr_test' from source: include params 30529 1726882653.67677: variable 'lsr_test' from source: include params 30529 1726882653.67698: handler run complete 30529 1726882653.67756: attempt loop complete, returning result 30529 1726882653.67759: variable 'item' from source: unknown 30529 1726882653.67813: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30529 1726882653.68020: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.68023: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882653.68025: variable 'omit' from source: magic vars 30529 1726882653.68191: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.68197: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.68199: variable 'omit' from source: magic vars 30529 1726882653.68202: variable 'omit' from source: magic vars 30529 1726882653.68231: variable 'item' from source: unknown 30529 1726882653.68302: variable 'item' from source: unknown 30529 1726882653.68323: variable 'omit' from source: magic vars 30529 1726882653.68358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.68362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.68409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.68412: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.68415: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.68418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.68479: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.68489: Set connection var ansible_pipelining to False 30529 1726882653.68498: Set connection var ansible_shell_type to sh 30529 1726882653.68517: Set connection var ansible_timeout to 10 30529 1726882653.68525: Set connection var ansible_connection to ssh 30529 1726882653.68576: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.68579: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.68582: variable 'ansible_connection' from source: unknown 30529 1726882653.68584: variable 
'ansible_module_compression' from source: unknown 30529 1726882653.68585: variable 'ansible_shell_type' from source: unknown 30529 1726882653.68587: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.68588: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.68590: variable 'ansible_pipelining' from source: unknown 30529 1726882653.68592: variable 'ansible_timeout' from source: unknown 30529 1726882653.68597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.68692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.68707: variable 'omit' from source: magic vars 30529 1726882653.68715: starting attempt loop 30529 1726882653.68734: running the handler 30529 1726882653.68752: variable 'lsr_assert' from source: include params 30529 1726882653.68843: variable 'lsr_assert' from source: include params 30529 1726882653.68847: handler run complete 30529 1726882653.68859: attempt loop complete, returning result 30529 1726882653.68878: variable 'item' from source: unknown 30529 1726882653.68943: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 30529 1726882653.69177: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.69180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.69183: variable 'omit' from source: magic vars 30529 1726882653.69398: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.69406: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30529 1726882653.69414: variable 'omit' from source: magic vars 30529 1726882653.69416: variable 'omit' from source: magic vars 30529 1726882653.69418: variable 'item' from source: unknown 30529 1726882653.69464: variable 'item' from source: unknown 30529 1726882653.69484: variable 'omit' from source: magic vars 30529 1726882653.69508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.69528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.69542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.69626: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.69629: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.69632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.69650: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.69661: Set connection var ansible_pipelining to False 30529 1726882653.69668: Set connection var ansible_shell_type to sh 30529 1726882653.69681: Set connection var ansible_timeout to 10 30529 1726882653.69687: Set connection var ansible_connection to ssh 30529 1726882653.69699: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.69721: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.69733: variable 'ansible_connection' from source: unknown 30529 1726882653.69742: variable 'ansible_module_compression' from source: unknown 30529 1726882653.69754: variable 'ansible_shell_type' from source: unknown 30529 1726882653.69762: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.69769: variable 'ansible_host' from source: host vars 
for 'managed_node1' 30529 1726882653.69777: variable 'ansible_pipelining' from source: unknown 30529 1726882653.69844: variable 'ansible_timeout' from source: unknown 30529 1726882653.69847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.69888: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.69904: variable 'omit' from source: magic vars 30529 1726882653.69912: starting attempt loop 30529 1726882653.69919: running the handler 30529 1726882653.69941: variable 'lsr_assert_when' from source: include params 30529 1726882653.70012: variable 'lsr_assert_when' from source: include params 30529 1726882653.70110: variable 'network_provider' from source: set_fact 30529 1726882653.70147: handler run complete 30529 1726882653.70171: attempt loop complete, returning result 30529 1726882653.70197: variable 'item' from source: unknown 30529 1726882653.70280: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30529 1726882653.70438: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.70441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.70443: variable 'omit' from source: magic vars 30529 1726882653.70598: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.70605: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.70608: variable 'omit' from source: magic vars 30529 1726882653.70609: variable 'omit' from source: magic vars 30529 1726882653.70639: 
variable 'item' from source: unknown 30529 1726882653.70704: variable 'item' from source: unknown 30529 1726882653.70725: variable 'omit' from source: magic vars 30529 1726882653.70773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.70776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.70778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.70780: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.70782: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.70784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.70854: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.70862: Set connection var ansible_pipelining to False 30529 1726882653.70868: Set connection var ansible_shell_type to sh 30529 1726882653.70883: Set connection var ansible_timeout to 10 30529 1726882653.70930: Set connection var ansible_connection to ssh 30529 1726882653.70933: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.70935: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.70937: variable 'ansible_connection' from source: unknown 30529 1726882653.70938: variable 'ansible_module_compression' from source: unknown 30529 1726882653.70940: variable 'ansible_shell_type' from source: unknown 30529 1726882653.70942: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.70944: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.70945: variable 'ansible_pipelining' from source: unknown 30529 1726882653.70947: variable 'ansible_timeout' from 
source: unknown 30529 1726882653.70952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.71038: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.71050: variable 'omit' from source: magic vars 30529 1726882653.71057: starting attempt loop 30529 1726882653.71062: running the handler 30529 1726882653.71099: variable 'lsr_fail_debug' from source: play vars 30529 1726882653.71152: variable 'lsr_fail_debug' from source: play vars 30529 1726882653.71173: handler run complete 30529 1726882653.71199: attempt loop complete, returning result 30529 1726882653.71258: variable 'item' from source: unknown 30529 1726882653.71282: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30529 1726882653.71598: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.71602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.71604: variable 'omit' from source: magic vars 30529 1726882653.71606: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.71614: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.71626: variable 'omit' from source: magic vars 30529 1726882653.71645: variable 'omit' from source: magic vars 30529 1726882653.71688: variable 'item' from source: unknown 30529 1726882653.71759: variable 'item' from source: unknown 30529 1726882653.71778: variable 'omit' from source: magic vars 30529 1726882653.71801: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882653.71819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.71837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882653.71851: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882653.71859: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.71867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.71944: Set connection var ansible_shell_executable to /bin/sh 30529 1726882653.71955: Set connection var ansible_pipelining to False 30529 1726882653.72049: Set connection var ansible_shell_type to sh 30529 1726882653.72052: Set connection var ansible_timeout to 10 30529 1726882653.72055: Set connection var ansible_connection to ssh 30529 1726882653.72057: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882653.72059: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.72061: variable 'ansible_connection' from source: unknown 30529 1726882653.72062: variable 'ansible_module_compression' from source: unknown 30529 1726882653.72064: variable 'ansible_shell_type' from source: unknown 30529 1726882653.72066: variable 'ansible_shell_executable' from source: unknown 30529 1726882653.72068: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.72070: variable 'ansible_pipelining' from source: unknown 30529 1726882653.72071: variable 'ansible_timeout' from source: unknown 30529 1726882653.72073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.72136: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882653.72154: variable 'omit' from source: magic vars 30529 1726882653.72166: starting attempt loop 30529 1726882653.72172: running the handler 30529 1726882653.72196: variable 'lsr_cleanup' from source: include params 30529 1726882653.72256: variable 'lsr_cleanup' from source: include params 30529 1726882653.72283: handler run complete 30529 1726882653.72302: attempt loop complete, returning result 30529 1726882653.72321: variable 'item' from source: unknown 30529 1726882653.72386: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30529 1726882653.72594: dumping result to json 30529 1726882653.72598: done dumping result, returning 30529 1726882653.72600: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-000000001745] 30529 1726882653.72602: sending task result for task 12673a56-9f93-b0f1-edc0-000000001745 30529 1726882653.72648: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001745 30529 1726882653.72651: WORKER PROCESS EXITING 30529 1726882653.72745: no more pending results, returning what we have 30529 1726882653.72749: results queue empty 30529 1726882653.72750: checking for any_errors_fatal 30529 1726882653.72757: done checking for any_errors_fatal 30529 1726882653.72758: checking for max_fail_percentage 30529 1726882653.72760: done checking for max_fail_percentage 30529 1726882653.72761: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.72762: done checking to see if all hosts have failed 30529 1726882653.72762: getting the remaining hosts for this loop 
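The seven `ok: [managed_node1] => (item=...)` results above (lsr_description, lsr_setup, lsr_test, lsr_assert, lsr_assert_when, lsr_fail_debug, lsr_cleanup) all come from the single "Show item" task at run_test.yml:9. The shape of the output, where each result carries `ansible_loop_var: item` plus a key named after the item itself, indicates a `debug` task using `var: "{{ item }}"` over a loop of variable names. The task body below is a hedged reconstruction, not a verbatim copy of run_test.yml:

```yaml
# Hedged sketch of the "Show item" task at run_test.yml:9.
# Looping over variable *names* and passing each to debug's `var`
# reproduces the per-item output seen in the log, e.g.:
#   ok: [managed_node1] => (item=lsr_cleanup) => {
#     "item": "lsr_cleanup",
#     "lsr_cleanup": ["tasks/cleanup_profile+device.yml"] }
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
```

Because `debug` runs locally per loop item, the log repeats the full connection-variable setup (ansible_shell_executable, ansible_pipelining, ansible_timeout, and so on) once for every item; that repetition is expected at this verbosity level, not a sign of re-connection.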
30529 1726882653.72764: done getting the remaining hosts for this loop 30529 1726882653.72768: getting the next task for host managed_node1 30529 1726882653.72776: done getting next task for host managed_node1 30529 1726882653.72779: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882653.72782: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882653.72786: getting variables 30529 1726882653.72788: in VariableManager get_vars() 30529 1726882653.72837: Calling all_inventory to load vars for managed_node1 30529 1726882653.72840: Calling groups_inventory to load vars for managed_node1 30529 1726882653.72845: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.72856: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.72860: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.72864: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.74877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.76292: done with get_vars() 30529 1726882653.76320: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:37:33 -0400 (0:00:00.133) 0:01:07.790 ****** 30529 
1726882653.76421: entering _queue_task() for managed_node1/include_tasks 30529 1726882653.77007: worker is 1 (out of 1 available) 30529 1726882653.77018: exiting _queue_task() for managed_node1/include_tasks 30529 1726882653.77031: done queuing things up, now waiting for results queue to drain 30529 1726882653.77032: waiting for pending results... 30529 1726882653.77492: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882653.77537: in run() - task 12673a56-9f93-b0f1-edc0-000000001746 30529 1726882653.77560: variable 'ansible_search_path' from source: unknown 30529 1726882653.77567: variable 'ansible_search_path' from source: unknown 30529 1726882653.77618: calling self._execute() 30529 1726882653.77725: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.77736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.77754: variable 'omit' from source: magic vars 30529 1726882653.78154: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.78238: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.78241: _execute() done 30529 1726882653.78244: dumping result to json 30529 1726882653.78247: done dumping result, returning 30529 1726882653.78249: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-000000001746] 30529 1726882653.78251: sending task result for task 12673a56-9f93-b0f1-edc0-000000001746 30529 1726882653.78599: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001746 30529 1726882653.78603: WORKER PROCESS EXITING 30529 1726882653.78629: no more pending results, returning what we have 30529 1726882653.78634: in VariableManager get_vars() 30529 1726882653.78671: Calling all_inventory to load vars for managed_node1 30529 1726882653.78674: Calling groups_inventory to load vars for managed_node1 30529 
1726882653.78678: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.78690: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.78695: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.78699: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.80243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.82465: done with get_vars() 30529 1726882653.82491: variable 'ansible_search_path' from source: unknown 30529 1726882653.82492: variable 'ansible_search_path' from source: unknown 30529 1726882653.82535: we have included files to process 30529 1726882653.82537: generating all_blocks data 30529 1726882653.82539: done generating all_blocks data 30529 1726882653.82544: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882653.82545: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882653.82547: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882653.82863: in VariableManager get_vars() 30529 1726882653.82890: done with get_vars() 30529 1726882653.83114: done processing included file 30529 1726882653.83116: iterating over new_blocks loaded from include file 30529 1726882653.83118: in VariableManager get_vars() 30529 1726882653.83133: done with get_vars() 30529 1726882653.83135: filtering new block on tags 30529 1726882653.83172: done filtering new block on tags 30529 1726882653.83175: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 1726882653.83180: 
extending task lists for all hosts with included blocks 30529 1726882653.83792: done extending task lists 30529 1726882653.83796: done processing included files 30529 1726882653.83797: results queue empty 30529 1726882653.83797: checking for any_errors_fatal 30529 1726882653.83805: done checking for any_errors_fatal 30529 1726882653.83806: checking for max_fail_percentage 30529 1726882653.83807: done checking for max_fail_percentage 30529 1726882653.83808: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.83809: done checking to see if all hosts have failed 30529 1726882653.83810: getting the remaining hosts for this loop 30529 1726882653.83816: done getting the remaining hosts for this loop 30529 1726882653.83819: getting the next task for host managed_node1 30529 1726882653.83824: done getting next task for host managed_node1 30529 1726882653.83826: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882653.83829: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882653.83832: getting variables 30529 1726882653.83833: in VariableManager get_vars() 30529 1726882653.83845: Calling all_inventory to load vars for managed_node1 30529 1726882653.83847: Calling groups_inventory to load vars for managed_node1 30529 1726882653.83850: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.83856: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.83858: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.83861: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.85245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.87328: done with get_vars() 30529 1726882653.87359: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:37:33 -0400 (0:00:00.110) 0:01:07.900 ****** 30529 1726882653.87439: entering _queue_task() for managed_node1/include_tasks 30529 1726882653.88219: worker is 1 (out of 1 available) 30529 1726882653.88233: exiting _queue_task() for managed_node1/include_tasks 30529 1726882653.88248: done queuing things up, now waiting for results queue to drain 30529 1726882653.88249: waiting for pending results... 
30529 1726882653.88699: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882653.89007: in run() - task 12673a56-9f93-b0f1-edc0-00000000176d 30529 1726882653.89022: variable 'ansible_search_path' from source: unknown 30529 1726882653.89026: variable 'ansible_search_path' from source: unknown 30529 1726882653.89062: calling self._execute() 30529 1726882653.89199: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882653.89203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882653.89206: variable 'omit' from source: magic vars 30529 1726882653.89954: variable 'ansible_distribution_major_version' from source: facts 30529 1726882653.89967: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882653.89972: _execute() done 30529 1726882653.89975: dumping result to json 30529 1726882653.89980: done dumping result, returning 30529 1726882653.89989: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-00000000176d] 30529 1726882653.90008: sending task result for task 12673a56-9f93-b0f1-edc0-00000000176d 30529 1726882653.90298: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000176d 30529 1726882653.90301: WORKER PROCESS EXITING 30529 1726882653.90356: no more pending results, returning what we have 30529 1726882653.90361: in VariableManager get_vars() 30529 1726882653.90410: Calling all_inventory to load vars for managed_node1 30529 1726882653.90413: Calling groups_inventory to load vars for managed_node1 30529 1726882653.90419: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.90436: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.90440: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.90444: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882653.93081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882653.96345: done with get_vars() 30529 1726882653.96364: variable 'ansible_search_path' from source: unknown 30529 1726882653.96366: variable 'ansible_search_path' from source: unknown 30529 1726882653.96610: we have included files to process 30529 1726882653.96611: generating all_blocks data 30529 1726882653.96613: done generating all_blocks data 30529 1726882653.96614: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882653.96615: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882653.96617: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882653.97079: done processing included file 30529 1726882653.97081: iterating over new_blocks loaded from include file 30529 1726882653.97083: in VariableManager get_vars() 30529 1726882653.97101: done with get_vars() 30529 1726882653.97103: filtering new block on tags 30529 1726882653.97139: done filtering new block on tags 30529 1726882653.97141: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882653.97147: extending task lists for all hosts with included blocks 30529 1726882653.97504: done extending task lists 30529 1726882653.97505: done processing included files 30529 1726882653.97506: results queue empty 30529 1726882653.97507: checking for any_errors_fatal 30529 1726882653.97510: done checking for any_errors_fatal 30529 1726882653.97510: checking for max_fail_percentage 30529 1726882653.97512: done 
checking for max_fail_percentage 30529 1726882653.97512: checking to see if all hosts have failed and the running result is not ok 30529 1726882653.97513: done checking to see if all hosts have failed 30529 1726882653.97514: getting the remaining hosts for this loop 30529 1726882653.97515: done getting the remaining hosts for this loop 30529 1726882653.97518: getting the next task for host managed_node1 30529 1726882653.97522: done getting next task for host managed_node1 30529 1726882653.97524: ^ task is: TASK: Gather current interface info 30529 1726882653.97528: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882653.97530: getting variables 30529 1726882653.97531: in VariableManager get_vars() 30529 1726882653.97542: Calling all_inventory to load vars for managed_node1 30529 1726882653.97544: Calling groups_inventory to load vars for managed_node1 30529 1726882653.97547: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882653.97552: Calling all_plugins_play to load vars for managed_node1 30529 1726882653.97554: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882653.97557: Calling groups_plugins_play to load vars for managed_node1 30529 1726882653.99027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.01939: done with get_vars() 30529 1726882654.01970: done getting variables 30529 1726882654.02322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:37:34 -0400 (0:00:00.149) 0:01:08.049 ****** 30529 1726882654.02354: entering _queue_task() for managed_node1/command 30529 1726882654.03402: worker is 1 (out of 1 available) 30529 1726882654.03415: exiting _queue_task() for managed_node1/command 30529 1726882654.03426: done queuing things up, now waiting for results queue to drain 30529 1726882654.03427: waiting for pending results... 
30529 1726882654.04214: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882654.04220: in run() - task 12673a56-9f93-b0f1-edc0-0000000017a8 30529 1726882654.04223: variable 'ansible_search_path' from source: unknown 30529 1726882654.04226: variable 'ansible_search_path' from source: unknown 30529 1726882654.04228: calling self._execute() 30529 1726882654.04454: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.04459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.04462: variable 'omit' from source: magic vars 30529 1726882654.04954: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.04967: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.04973: variable 'omit' from source: magic vars 30529 1726882654.05301: variable 'omit' from source: magic vars 30529 1726882654.05304: variable 'omit' from source: magic vars 30529 1726882654.05313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882654.05351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882654.05373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882654.05391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.05613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.05644: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882654.05648: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.05651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882654.05760: Set connection var ansible_shell_executable to /bin/sh 30529 1726882654.05766: Set connection var ansible_pipelining to False 30529 1726882654.05768: Set connection var ansible_shell_type to sh 30529 1726882654.05779: Set connection var ansible_timeout to 10 30529 1726882654.05782: Set connection var ansible_connection to ssh 30529 1726882654.05787: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882654.06017: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.06021: variable 'ansible_connection' from source: unknown 30529 1726882654.06024: variable 'ansible_module_compression' from source: unknown 30529 1726882654.06027: variable 'ansible_shell_type' from source: unknown 30529 1726882654.06029: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.06032: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.06034: variable 'ansible_pipelining' from source: unknown 30529 1726882654.06036: variable 'ansible_timeout' from source: unknown 30529 1726882654.06041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.06498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882654.06502: variable 'omit' from source: magic vars 30529 1726882654.06505: starting attempt loop 30529 1726882654.06508: running the handler 30529 1726882654.06510: _low_level_execute_command(): starting 30529 1726882654.06512: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882654.08015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.08039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882654.08042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.08409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.09892: stdout chunk (state=3): >>>/root <<< 30529 1726882654.10010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.10049: stderr chunk (state=3): >>><<< 30529 1726882654.10074: stdout chunk (state=3): >>><<< 30529 1726882654.10099: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882654.10205: _low_level_execute_command(): starting 30529 1726882654.10212: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949 `" && echo ansible-tmp-1726882654.1011002-33756-236921772989949="` echo /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949 `" ) && sleep 0' 30529 1726882654.11311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882654.11510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.11538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882654.11662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.13532: stdout chunk (state=3): >>>ansible-tmp-1726882654.1011002-33756-236921772989949=/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949 <<< 30529 1726882654.13669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.13680: stdout chunk (state=3): >>><<< 30529 1726882654.13698: stderr chunk (state=3): >>><<< 30529 1726882654.13723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882654.1011002-33756-236921772989949=/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882654.13807: variable 'ansible_module_compression' from source: unknown 30529 1726882654.14000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882654.14004: variable 'ansible_facts' from source: unknown 30529 1726882654.14159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py 30529 1726882654.14514: Sending initial data 30529 1726882654.14524: Sent initial data (156 bytes) 30529 1726882654.15866: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882654.15889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.16015: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882654.16050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.17596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882654.17619: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882654.17742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882654.17795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp2ks9d5il /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py <<< 30529 1726882654.17799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py" <<< 30529 1726882654.17954: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp2ks9d5il" to remote "/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py" <<< 30529 1726882654.19184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.19259: stderr chunk (state=3): >>><<< 30529 1726882654.19263: stdout chunk (state=3): >>><<< 30529 1726882654.19458: done transferring module to remote 30529 1726882654.19461: _low_level_execute_command(): starting 30529 1726882654.19464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/ /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py && sleep 0' 30529 1726882654.21120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.21124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882654.21214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.23046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.23075: stderr chunk (state=3): >>><<< 30529 1726882654.23272: stdout chunk (state=3): >>><<< 30529 1726882654.23276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882654.23284: _low_level_execute_command(): starting 30529 1726882654.23287: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/AnsiballZ_command.py && sleep 0' 30529 1726882654.24292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882654.24340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882654.24366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882654.24386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882654.24405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882654.24487: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.24516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882654.24534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.24559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30529 1726882654.24723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.39916: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:34.395086", "end": "2024-09-20 21:37:34.398119", "delta": "0:00:00.003033", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882654.41332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882654.41338: stdout chunk (state=3): >>><<< 30529 1726882654.41347: stderr chunk (state=3): >>><<< 30529 1726882654.41363: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:34.395086", "end": "2024-09-20 21:37:34.398119", "delta": "0:00:00.003033", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882654.41395: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882654.41405: _low_level_execute_command(): starting 30529 1726882654.41410: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882654.1011002-33756-236921772989949/ > /dev/null 2>&1 && sleep 0' 30529 1726882654.41852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882654.41856: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.41858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882654.41860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882654.41862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.41929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.41932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882654.41985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.43753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.43782: stderr chunk (state=3): >>><<< 30529 1726882654.43785: stdout chunk (state=3): >>><<< 30529 1726882654.43799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30529 1726882654.43871: handler run complete
30529 1726882654.43874: Evaluated conditional (False): False
30529 1726882654.43876: attempt loop complete, returning result
30529 1726882654.43878: _execute() done
30529 1726882654.43881: dumping result to json
30529 1726882654.43883: done dumping result, returning
30529 1726882654.43885: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-0000000017a8]
30529 1726882654.43887: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017a8
30529 1726882654.43956: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017a8
30529 1726882654.43958: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003033",
    "end": "2024-09-20 21:37:34.398119",
    "rc": 0,
    "start": "2024-09-20 21:37:34.395086"
}

STDOUT:

bonding_masters
eth0
lo

30529 1726882654.44045: no more pending results, returning what we have
30529 1726882654.44049: results queue empty
30529 1726882654.44050: checking for any_errors_fatal
30529 1726882654.44051: done checking for any_errors_fatal
30529 1726882654.44052: checking for max_fail_percentage
30529
1726882654.44053: done checking for max_fail_percentage 30529 1726882654.44054: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.44055: done checking to see if all hosts have failed 30529 1726882654.44056: getting the remaining hosts for this loop 30529 1726882654.44058: done getting the remaining hosts for this loop 30529 1726882654.44061: getting the next task for host managed_node1 30529 1726882654.44071: done getting next task for host managed_node1 30529 1726882654.44073: ^ task is: TASK: Set current_interfaces 30529 1726882654.44079: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30529 1726882654.44083: getting variables
30529 1726882654.44085: in VariableManager get_vars()
30529 1726882654.44125: Calling all_inventory to load vars for managed_node1
30529 1726882654.44128: Calling groups_inventory to load vars for managed_node1
30529 1726882654.44131: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882654.44141: Calling all_plugins_play to load vars for managed_node1
30529 1726882654.44144: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882654.44146: Calling groups_plugins_play to load vars for managed_node1
30529 1726882654.45076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882654.46431: done with get_vars()
30529 1726882654.46452: done getting variables
30529 1726882654.46501: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024 21:37:34 -0400 (0:00:00.441) 0:01:08.491 ******
30529 1726882654.46525: entering _queue_task() for managed_node1/set_fact
30529 1726882654.46784: worker is 1 (out of 1 available)
30529 1726882654.46801: exiting _queue_task() for managed_node1/set_fact
30529 1726882654.46813: done queuing things up, now waiting for results queue to drain
30529 1726882654.46815: waiting for pending results...
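Editor's note: the "Gather current interface info" result shown above was produced by `ansible.legacy.command` running with `chdir: /sys/class/net` and raw params `ls -1`, as recorded in the `_execute_module` dump. A minimal sketch of a task that would produce that invocation follows; this is a reconstruction from the logged module args, not the verbatim contents of `get_current_interfaces.yml`, and the register name is taken from the `_current_interfaces` variable seen later in the log.

```yaml
# Sketch reconstructed from the logged module_args; file contents are assumed.
- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1              # logged as cmd: ["ls", "-1"]
    chdir: /sys/class/net   # one entry per kernel network interface (plus bonding_masters)
  register: _current_interfaces
```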
30529 1726882654.46998: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882654.47081: in run() - task 12673a56-9f93-b0f1-edc0-0000000017a9 30529 1726882654.47096: variable 'ansible_search_path' from source: unknown 30529 1726882654.47100: variable 'ansible_search_path' from source: unknown 30529 1726882654.47127: calling self._execute() 30529 1726882654.47200: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.47203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.47212: variable 'omit' from source: magic vars 30529 1726882654.47477: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.47489: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.47496: variable 'omit' from source: magic vars 30529 1726882654.47530: variable 'omit' from source: magic vars 30529 1726882654.47604: variable '_current_interfaces' from source: set_fact 30529 1726882654.47651: variable 'omit' from source: magic vars 30529 1726882654.47681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882654.47714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882654.47730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882654.47745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.47755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.47783: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882654.47786: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.47791: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.47862: Set connection var ansible_shell_executable to /bin/sh 30529 1726882654.47866: Set connection var ansible_pipelining to False 30529 1726882654.47868: Set connection var ansible_shell_type to sh 30529 1726882654.47877: Set connection var ansible_timeout to 10 30529 1726882654.47879: Set connection var ansible_connection to ssh 30529 1726882654.47884: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882654.47903: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.47906: variable 'ansible_connection' from source: unknown 30529 1726882654.47909: variable 'ansible_module_compression' from source: unknown 30529 1726882654.47911: variable 'ansible_shell_type' from source: unknown 30529 1726882654.47914: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.47916: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.47920: variable 'ansible_pipelining' from source: unknown 30529 1726882654.47922: variable 'ansible_timeout' from source: unknown 30529 1726882654.47924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.48024: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882654.48034: variable 'omit' from source: magic vars 30529 1726882654.48037: starting attempt loop 30529 1726882654.48040: running the handler 30529 1726882654.48052: handler run complete 30529 1726882654.48060: attempt loop complete, returning result 30529 1726882654.48063: _execute() done 30529 1726882654.48065: dumping result to json 30529 1726882654.48067: done dumping result, returning 30529 
1726882654.48074: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-0000000017a9]
30529 1726882654.48076: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017a9
30529 1726882654.48156: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017a9
30529 1726882654.48159: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
30529 1726882654.48219: no more pending results, returning what we have
30529 1726882654.48222: results queue empty
30529 1726882654.48223: checking for any_errors_fatal
30529 1726882654.48232: done checking for any_errors_fatal
30529 1726882654.48233: checking for max_fail_percentage
30529 1726882654.48234: done checking for max_fail_percentage
30529 1726882654.48235: checking to see if all hosts have failed and the running result is not ok
30529 1726882654.48236: done checking to see if all hosts have failed
30529 1726882654.48237: getting the remaining hosts for this loop
30529 1726882654.48238: done getting the remaining hosts for this loop
30529 1726882654.48242: getting the next task for host managed_node1
30529 1726882654.48252: done getting next task for host managed_node1
30529 1726882654.48254: ^ task is: TASK: Show current_interfaces
30529 1726882654.48258: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882654.48262: getting variables
30529 1726882654.48264: in VariableManager get_vars()
30529 1726882654.48304: Calling all_inventory to load vars for managed_node1
30529 1726882654.48307: Calling groups_inventory to load vars for managed_node1
30529 1726882654.48310: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882654.48320: Calling all_plugins_play to load vars for managed_node1
30529 1726882654.48323: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882654.48325: Calling groups_plugins_play to load vars for managed_node1
30529 1726882654.49128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882654.49997: done with get_vars()
30529 1726882654.50012: done getting variables
30529 1726882654.50057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024 21:37:34 -0400 (0:00:00.035) 0:01:08.526 ******
30529 1726882654.50080: entering _queue_task() for managed_node1/debug
30529 1726882654.50322: worker is 1 (out of 1 available)
30529 1726882654.50335: exiting _queue_task() for managed_node1/debug
30529 1726882654.50347: done queuing things up, now waiting for results queue to drain
30529 1726882654.50348: waiting for pending results...
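Editor's note: the "Set current_interfaces" task (task path `.../tasks/get_current_interfaces.yml:9`) turned the registered command output into the `current_interfaces` fact shown in the preceding `ok:` result. A plausible sketch of that task, assuming the fact is derived from the registered result's `stdout_lines`:

```yaml
# Plausible sketch; the exact Jinja2 expression is an assumption.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```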
30529 1726882654.50532: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 30529 1726882654.50612: in run() - task 12673a56-9f93-b0f1-edc0-00000000176e 30529 1726882654.50623: variable 'ansible_search_path' from source: unknown 30529 1726882654.50626: variable 'ansible_search_path' from source: unknown 30529 1726882654.50654: calling self._execute() 30529 1726882654.50726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.50732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.50740: variable 'omit' from source: magic vars 30529 1726882654.51017: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.51025: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.51032: variable 'omit' from source: magic vars 30529 1726882654.51060: variable 'omit' from source: magic vars 30529 1726882654.51133: variable 'current_interfaces' from source: set_fact 30529 1726882654.51154: variable 'omit' from source: magic vars 30529 1726882654.51186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882654.51218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882654.51237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882654.51252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.51262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.51285: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882654.51288: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.51290: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.51365: Set connection var ansible_shell_executable to /bin/sh 30529 1726882654.51369: Set connection var ansible_pipelining to False 30529 1726882654.51371: Set connection var ansible_shell_type to sh 30529 1726882654.51379: Set connection var ansible_timeout to 10 30529 1726882654.51382: Set connection var ansible_connection to ssh 30529 1726882654.51387: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882654.51408: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.51411: variable 'ansible_connection' from source: unknown 30529 1726882654.51414: variable 'ansible_module_compression' from source: unknown 30529 1726882654.51416: variable 'ansible_shell_type' from source: unknown 30529 1726882654.51419: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.51421: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.51423: variable 'ansible_pipelining' from source: unknown 30529 1726882654.51427: variable 'ansible_timeout' from source: unknown 30529 1726882654.51431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.51533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882654.51542: variable 'omit' from source: magic vars 30529 1726882654.51549: starting attempt loop 30529 1726882654.51552: running the handler 30529 1726882654.51589: handler run complete 30529 1726882654.51604: attempt loop complete, returning result 30529 1726882654.51607: _execute() done 30529 1726882654.51609: dumping result to json 30529 1726882654.51611: done dumping result, returning 30529 1726882654.51618: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-00000000176e]
30529 1726882654.51621: sending task result for task 12673a56-9f93-b0f1-edc0-00000000176e
30529 1726882654.51703: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000176e
30529 1726882654.51706: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

30529 1726882654.51749: no more pending results, returning what we have
30529 1726882654.51752: results queue empty
30529 1726882654.51753: checking for any_errors_fatal
30529 1726882654.51758: done checking for any_errors_fatal
30529 1726882654.51759: checking for max_fail_percentage
30529 1726882654.51761: done checking for max_fail_percentage
30529 1726882654.51762: checking to see if all hosts have failed and the running result is not ok
30529 1726882654.51762: done checking to see if all hosts have failed
30529 1726882654.51763: getting the remaining hosts for this loop
30529 1726882654.51765: done getting the remaining hosts for this loop
30529 1726882654.51768: getting the next task for host managed_node1
30529 1726882654.51777: done getting next task for host managed_node1
30529 1726882654.51780: ^ task is: TASK: Setup
30529 1726882654.51783: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
30529 1726882654.51787: getting variables
30529 1726882654.51789: in VariableManager get_vars()
30529 1726882654.51828: Calling all_inventory to load vars for managed_node1
30529 1726882654.51831: Calling groups_inventory to load vars for managed_node1
30529 1726882654.51834: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882654.51844: Calling all_plugins_play to load vars for managed_node1
30529 1726882654.51847: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882654.51849: Calling groups_plugins_play to load vars for managed_node1
30529 1726882654.52782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882654.53623: done with get_vars()
30529 1726882654.53641: done getting variables

TASK [Setup] *******************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24
Friday 20 September 2024 21:37:34 -0400 (0:00:00.036) 0:01:08.563 ******
30529 1726882654.53707: entering _queue_task() for managed_node1/include_tasks
30529 1726882654.53953: worker is 1 (out of 1 available)
30529 1726882654.53967: exiting _queue_task() for managed_node1/include_tasks
30529 1726882654.53980: done queuing things up, now waiting for results queue to drain
30529 1726882654.53981: waiting for pending results...
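Editor's note: the "Show current_interfaces" task (`.../tasks/show_interfaces.yml:5`) printed `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']` above. A minimal sketch of a debug task producing that message; not the verbatim task file:

```yaml
# Minimal sketch matching the logged MSG; the exact wording is an assumption.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```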
30529 1726882654.54166: running TaskExecutor() for managed_node1/TASK: Setup 30529 1726882654.54233: in run() - task 12673a56-9f93-b0f1-edc0-000000001747 30529 1726882654.54244: variable 'ansible_search_path' from source: unknown 30529 1726882654.54248: variable 'ansible_search_path' from source: unknown 30529 1726882654.54284: variable 'lsr_setup' from source: include params 30529 1726882654.54446: variable 'lsr_setup' from source: include params 30529 1726882654.54504: variable 'omit' from source: magic vars 30529 1726882654.54601: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.54607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.54617: variable 'omit' from source: magic vars 30529 1726882654.54786: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.54797: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.54804: variable 'item' from source: unknown 30529 1726882654.54852: variable 'item' from source: unknown 30529 1726882654.54878: variable 'item' from source: unknown 30529 1726882654.54925: variable 'item' from source: unknown 30529 1726882654.55046: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.55050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.55052: variable 'omit' from source: magic vars 30529 1726882654.55135: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.55139: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.55144: variable 'item' from source: unknown 30529 1726882654.55190: variable 'item' from source: unknown 30529 1726882654.55215: variable 'item' from source: unknown 30529 1726882654.55257: variable 'item' from source: unknown 30529 1726882654.55320: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882654.55331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.55338: variable 'omit' from source: magic vars 30529 1726882654.55437: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.55443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.55448: variable 'item' from source: unknown 30529 1726882654.55490: variable 'item' from source: unknown 30529 1726882654.55513: variable 'item' from source: unknown 30529 1726882654.55557: variable 'item' from source: unknown 30529 1726882654.55613: dumping result to json 30529 1726882654.55617: done dumping result, returning 30529 1726882654.55619: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-000000001747] 30529 1726882654.55622: sending task result for task 12673a56-9f93-b0f1-edc0-000000001747 30529 1726882654.55656: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001747 30529 1726882654.55659: WORKER PROCESS EXITING 30529 1726882654.55691: no more pending results, returning what we have 30529 1726882654.55697: in VariableManager get_vars() 30529 1726882654.55738: Calling all_inventory to load vars for managed_node1 30529 1726882654.55740: Calling groups_inventory to load vars for managed_node1 30529 1726882654.55743: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.55758: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.55761: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.55763: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.56597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.57452: done with get_vars() 30529 1726882654.57467: variable 'ansible_search_path' from source: unknown 30529 1726882654.57468: variable 'ansible_search_path' from source: unknown 30529 
1726882654.57498: variable 'ansible_search_path' from source: unknown 30529 1726882654.57499: variable 'ansible_search_path' from source: unknown 30529 1726882654.57516: variable 'ansible_search_path' from source: unknown 30529 1726882654.57517: variable 'ansible_search_path' from source: unknown 30529 1726882654.57536: we have included files to process 30529 1726882654.57536: generating all_blocks data 30529 1726882654.57538: done generating all_blocks data 30529 1726882654.57541: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882654.57542: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882654.57543: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882654.57699: done processing included file 30529 1726882654.57700: iterating over new_blocks loaded from include file 30529 1726882654.57701: in VariableManager get_vars() 30529 1726882654.57711: done with get_vars() 30529 1726882654.57712: filtering new block on tags 30529 1726882654.57734: done filtering new block on tags 30529 1726882654.57736: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node1 => (item=tasks/create_bridge_profile.yml) 30529 1726882654.57740: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882654.57741: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882654.57743: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882654.57799: done processing included file 30529 1726882654.57801: iterating over new_blocks loaded from include file 30529 1726882654.57801: in VariableManager get_vars() 30529 1726882654.57811: done with get_vars() 30529 1726882654.57812: filtering new block on tags 30529 1726882654.57824: done filtering new block on tags 30529 1726882654.57825: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node1 => (item=tasks/activate_profile.yml) 30529 1726882654.57827: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882654.57828: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882654.57830: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30529 1726882654.57882: done processing included file 30529 1726882654.57884: iterating over new_blocks loaded from include file 30529 1726882654.57884: in VariableManager get_vars() 30529 1726882654.57896: done with get_vars() 30529 1726882654.57897: filtering new block on tags 30529 1726882654.57910: done filtering new block on tags 30529 1726882654.57912: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node1 => (item=tasks/remove_profile.yml) 30529 1726882654.57914: extending task lists for all hosts with included blocks 30529 1726882654.58364: done extending task lists 30529 1726882654.58370: done processing included files 30529 
1726882654.58371: results queue empty 30529 1726882654.58371: checking for any_errors_fatal 30529 1726882654.58374: done checking for any_errors_fatal 30529 1726882654.58374: checking for max_fail_percentage 30529 1726882654.58375: done checking for max_fail_percentage 30529 1726882654.58375: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.58376: done checking to see if all hosts have failed 30529 1726882654.58377: getting the remaining hosts for this loop 30529 1726882654.58377: done getting the remaining hosts for this loop 30529 1726882654.58379: getting the next task for host managed_node1 30529 1726882654.58382: done getting next task for host managed_node1 30529 1726882654.58383: ^ task is: TASK: Include network role 30529 1726882654.58385: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882654.58387: getting variables 30529 1726882654.58387: in VariableManager get_vars() 30529 1726882654.58397: Calling all_inventory to load vars for managed_node1 30529 1726882654.58399: Calling groups_inventory to load vars for managed_node1 30529 1726882654.58401: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.58405: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.58406: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.58407: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.59020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.59857: done with get_vars() 30529 1726882654.59870: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:37:34 -0400 (0:00:00.062) 0:01:08.625 ****** 30529 1726882654.59928: entering _queue_task() for managed_node1/include_role 30529 1726882654.60185: worker is 1 (out of 1 available) 30529 1726882654.60199: exiting _queue_task() for managed_node1/include_role 30529 1726882654.60213: done queuing things up, now waiting for results queue to drain 30529 1726882654.60215: waiting for pending results... 
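The records above show the "Include network role" task being queued from `create_bridge_profile.yml:3` after that file was loaded as an included file. The log does not show the file's contents, but a task producing this trace would plausibly have the following shape (a hypothetical sketch, not the actual file from the fedora.linux_system_roles collection; the profile name and settings are invented for illustration):

```yaml
# Hypothetical sketch of tasks/create_bridge_profile.yml — illustrative only.
# The real file's variables and profile settings are not visible in this log.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr        # hypothetical bridge profile name
        type: bridge
        state: up
```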
30529 1726882654.60403: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882654.60472: in run() - task 12673a56-9f93-b0f1-edc0-0000000017d0 30529 1726882654.60484: variable 'ansible_search_path' from source: unknown 30529 1726882654.60488: variable 'ansible_search_path' from source: unknown 30529 1726882654.60519: calling self._execute() 30529 1726882654.60597: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.60601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.60611: variable 'omit' from source: magic vars 30529 1726882654.60882: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.60897: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.60905: _execute() done 30529 1726882654.60908: dumping result to json 30529 1726882654.60911: done dumping result, returning 30529 1726882654.60916: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-0000000017d0] 30529 1726882654.60921: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d0 30529 1726882654.61023: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d0 30529 1726882654.61026: WORKER PROCESS EXITING 30529 1726882654.61050: no more pending results, returning what we have 30529 1726882654.61055: in VariableManager get_vars() 30529 1726882654.61097: Calling all_inventory to load vars for managed_node1 30529 1726882654.61100: Calling groups_inventory to load vars for managed_node1 30529 1726882654.61103: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.61115: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.61118: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.61121: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.61982: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.62837: done with get_vars() 30529 1726882654.62850: variable 'ansible_search_path' from source: unknown 30529 1726882654.62851: variable 'ansible_search_path' from source: unknown 30529 1726882654.62957: variable 'omit' from source: magic vars 30529 1726882654.62981: variable 'omit' from source: magic vars 30529 1726882654.62990: variable 'omit' from source: magic vars 30529 1726882654.62994: we have included files to process 30529 1726882654.62995: generating all_blocks data 30529 1726882654.62996: done generating all_blocks data 30529 1726882654.62997: processing included file: fedora.linux_system_roles.network 30529 1726882654.63010: in VariableManager get_vars() 30529 1726882654.63018: done with get_vars() 30529 1726882654.63035: in VariableManager get_vars() 30529 1726882654.63048: done with get_vars() 30529 1726882654.63073: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882654.63144: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882654.63192: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882654.63448: in VariableManager get_vars() 30529 1726882654.63461: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882654.64643: iterating over new_blocks loaded from include file 30529 1726882654.64644: in VariableManager get_vars() 30529 1726882654.64655: done with get_vars() 30529 1726882654.64656: filtering new block on tags 30529 1726882654.64809: done filtering new block on tags 30529 1726882654.64811: in VariableManager get_vars() 30529 1726882654.64821: done with get_vars() 30529 1726882654.64822: filtering new block on tags 30529 1726882654.64832: done 
filtering new block on tags 30529 1726882654.64833: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882654.64837: extending task lists for all hosts with included blocks 30529 1726882654.64927: done extending task lists 30529 1726882654.64928: done processing included files 30529 1726882654.64928: results queue empty 30529 1726882654.64929: checking for any_errors_fatal 30529 1726882654.64931: done checking for any_errors_fatal 30529 1726882654.64931: checking for max_fail_percentage 30529 1726882654.64932: done checking for max_fail_percentage 30529 1726882654.64932: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.64933: done checking to see if all hosts have failed 30529 1726882654.64933: getting the remaining hosts for this loop 30529 1726882654.64934: done getting the remaining hosts for this loop 30529 1726882654.64936: getting the next task for host managed_node1 30529 1726882654.64939: done getting next task for host managed_node1 30529 1726882654.64940: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882654.64942: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882654.64949: getting variables 30529 1726882654.64949: in VariableManager get_vars() 30529 1726882654.64957: Calling all_inventory to load vars for managed_node1 30529 1726882654.64959: Calling groups_inventory to load vars for managed_node1 30529 1726882654.64960: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.64963: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.64965: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.64966: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.65582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.66496: done with get_vars() 30529 1726882654.66510: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:34 -0400 (0:00:00.066) 0:01:08.691 ****** 30529 1726882654.66558: entering _queue_task() for managed_node1/include_tasks 30529 1726882654.66805: worker is 1 (out of 1 available) 30529 1726882654.66818: exiting _queue_task() for managed_node1/include_tasks 30529 1726882654.66831: done queuing things up, now waiting for results queue to drain 30529 1726882654.66832: waiting for pending results... 
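The trace above shows `roles/network/tasks/main.yml:4` pulling in `set_facts.yml` via `include_tasks`, with the run-wide conditional `ansible_distribution_major_version != '6'` evaluating True before the include is dispatched. Assuming the conditional is attached at (or inherited by) this task, the pattern would look roughly like this (a sketch; the real main.yml may attach the `when` differently, e.g. at the play or block level):

```yaml
# Hypothetical sketch of the include dispatched at main.yml:4 — illustrative only.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```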
30529 1726882654.67010: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882654.67105: in run() - task 12673a56-9f93-b0f1-edc0-00000000183a 30529 1726882654.67115: variable 'ansible_search_path' from source: unknown 30529 1726882654.67118: variable 'ansible_search_path' from source: unknown 30529 1726882654.67144: calling self._execute() 30529 1726882654.67218: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.67222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.67231: variable 'omit' from source: magic vars 30529 1726882654.67502: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.67513: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.67519: _execute() done 30529 1726882654.67523: dumping result to json 30529 1726882654.67526: done dumping result, returning 30529 1726882654.67532: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-00000000183a] 30529 1726882654.67537: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183a 30529 1726882654.67622: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000183a 30529 1726882654.67625: WORKER PROCESS EXITING 30529 1726882654.67668: no more pending results, returning what we have 30529 1726882654.67672: in VariableManager get_vars() 30529 1726882654.67719: Calling all_inventory to load vars for managed_node1 30529 1726882654.67722: Calling groups_inventory to load vars for managed_node1 30529 1726882654.67724: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.67736: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.67738: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.67740: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882654.68523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.69367: done with get_vars() 30529 1726882654.69383: variable 'ansible_search_path' from source: unknown 30529 1726882654.69384: variable 'ansible_search_path' from source: unknown 30529 1726882654.69412: we have included files to process 30529 1726882654.69413: generating all_blocks data 30529 1726882654.69414: done generating all_blocks data 30529 1726882654.69417: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882654.69418: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882654.69420: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882654.69802: done processing included file 30529 1726882654.69804: iterating over new_blocks loaded from include file 30529 1726882654.69805: in VariableManager get_vars() 30529 1726882654.69821: done with get_vars() 30529 1726882654.69823: filtering new block on tags 30529 1726882654.69841: done filtering new block on tags 30529 1726882654.69843: in VariableManager get_vars() 30529 1726882654.69858: done with get_vars() 30529 1726882654.69860: filtering new block on tags 30529 1726882654.69886: done filtering new block on tags 30529 1726882654.69887: in VariableManager get_vars() 30529 1726882654.69903: done with get_vars() 30529 1726882654.69905: filtering new block on tags 30529 1726882654.69928: done filtering new block on tags 30529 1726882654.69930: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882654.69934: extending task lists for 
all hosts with included blocks 30529 1726882654.70871: done extending task lists 30529 1726882654.70872: done processing included files 30529 1726882654.70873: results queue empty 30529 1726882654.70873: checking for any_errors_fatal 30529 1726882654.70876: done checking for any_errors_fatal 30529 1726882654.70877: checking for max_fail_percentage 30529 1726882654.70877: done checking for max_fail_percentage 30529 1726882654.70878: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.70878: done checking to see if all hosts have failed 30529 1726882654.70879: getting the remaining hosts for this loop 30529 1726882654.70880: done getting the remaining hosts for this loop 30529 1726882654.70881: getting the next task for host managed_node1 30529 1726882654.70885: done getting next task for host managed_node1 30529 1726882654.70887: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882654.70890: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882654.70899: getting variables 30529 1726882654.70900: in VariableManager get_vars() 30529 1726882654.70909: Calling all_inventory to load vars for managed_node1 30529 1726882654.70911: Calling groups_inventory to load vars for managed_node1 30529 1726882654.70912: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.70915: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.70917: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.70919: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.75205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.76046: done with get_vars() 30529 1726882654.76061: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:34 -0400 (0:00:00.095) 0:01:08.787 ****** 30529 1726882654.76115: entering _queue_task() for managed_node1/setup 30529 1726882654.76387: worker is 1 (out of 1 available) 30529 1726882654.76402: exiting _queue_task() for managed_node1/setup 30529 1726882654.76415: done queuing things up, now waiting for results queue to drain 30529 1726882654.76418: waiting for pending results... 
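The "Ensure ansible_facts used by role are present" task at `set_facts.yml:3` is skipped here because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated False — every fact the role needs is already present in `ansible_facts`, so no extra gathering is required. A minimal sketch of that conditional fact-gathering pattern, assuming a `setup`-based task (the actual module arguments in set_facts.yml are not shown in this log):

```yaml
# Hypothetical sketch of the conditional gather at set_facts.yml:3 — illustrative only.
# Gathers facts only when some required fact key is missing from ansible_facts.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed subset; the real task may gather more
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

Note the skip is reported with `"censored": ... 'no_log: true'`, so the task's full result is deliberately hidden in the output above.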
30529 1726882654.76608: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882654.76724: in run() - task 12673a56-9f93-b0f1-edc0-000000001897 30529 1726882654.76740: variable 'ansible_search_path' from source: unknown 30529 1726882654.76747: variable 'ansible_search_path' from source: unknown 30529 1726882654.76773: calling self._execute() 30529 1726882654.76852: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.76858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.76863: variable 'omit' from source: magic vars 30529 1726882654.77128: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.77137: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.77287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882654.78803: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882654.78855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882654.78882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882654.78912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882654.78933: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882654.78992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882654.79014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882654.79035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882654.79061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882654.79072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882654.79111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882654.79134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882654.79147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882654.79172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882654.79183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882654.79297: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882654.79300: variable 'ansible_facts' from source: unknown 30529 1726882654.79817: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882654.79820: when evaluation is False, skipping this task 30529 1726882654.79823: _execute() done 30529 1726882654.79825: dumping result to json 30529 1726882654.79827: done dumping result, returning 30529 1726882654.79830: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000001897] 30529 1726882654.79834: sending task result for task 12673a56-9f93-b0f1-edc0-000000001897 30529 1726882654.79913: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001897 30529 1726882654.79916: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882654.79962: no more pending results, returning what we have 30529 1726882654.79965: results queue empty 30529 1726882654.79967: checking for any_errors_fatal 30529 1726882654.79968: done checking for any_errors_fatal 30529 1726882654.79969: checking for max_fail_percentage 30529 1726882654.79970: done checking for max_fail_percentage 30529 1726882654.79971: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.79972: done checking to see if all hosts have failed 30529 1726882654.79972: getting the remaining hosts for this loop 30529 1726882654.79974: done getting the remaining hosts for this loop 30529 1726882654.79977: getting the next task for host managed_node1 30529 1726882654.79989: done getting next task for host managed_node1 30529 1726882654.79996: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882654.80003: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882654.80023: getting variables 30529 1726882654.80025: in VariableManager get_vars() 30529 1726882654.80065: Calling all_inventory to load vars for managed_node1 30529 1726882654.80067: Calling groups_inventory to load vars for managed_node1 30529 1726882654.80069: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.80078: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.80081: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.80089: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.80882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.82092: done with get_vars() 30529 1726882654.82116: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:34 -0400 (0:00:00.061) 0:01:08.848 ****** 30529 1726882654.82281: entering _queue_task() for managed_node1/stat 30529 1726882654.82565: worker is 1 (out of 1 available) 30529 1726882654.82580: exiting _queue_task() for managed_node1/stat 30529 1726882654.82597: done queuing things up, now waiting for results queue to drain 30529 1726882654.82599: waiting for pending results... 
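The "Check if system is ostree" task at `set_facts.yml:12` is likewise skipped: its guard `not __network_is_ostree is defined` is False because `__network_is_ostree` was already set by `set_fact` earlier in the run (the log attributes the variable to "source: set_fact"). A guarded stat check of this kind might be sketched as follows (hypothetical: the probed path and register name are assumptions, not taken from this log):

```yaml
# Hypothetical sketch of the ostree probe at set_facts.yml:12 — illustrative only.
# Runs once per play; skipped on later passes because the fact already exists.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted   # assumed marker path
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined
```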
30529 1726882654.82773: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882654.82876: in run() - task 12673a56-9f93-b0f1-edc0-000000001899 30529 1726882654.82886: variable 'ansible_search_path' from source: unknown 30529 1726882654.82895: variable 'ansible_search_path' from source: unknown 30529 1726882654.82922: calling self._execute() 30529 1726882654.82988: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.82996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.83003: variable 'omit' from source: magic vars 30529 1726882654.83262: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.83272: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.83386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882654.83567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882654.83602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882654.83655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882654.83685: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882654.83748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882654.83765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882654.83784: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882654.83805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882654.83868: variable '__network_is_ostree' from source: set_fact 30529 1726882654.83872: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882654.83875: when evaluation is False, skipping this task 30529 1726882654.83879: _execute() done 30529 1726882654.83881: dumping result to json 30529 1726882654.83886: done dumping result, returning 30529 1726882654.83898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000001899] 30529 1726882654.83903: sending task result for task 12673a56-9f93-b0f1-edc0-000000001899 30529 1726882654.83977: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001899 30529 1726882654.83979: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882654.84056: no more pending results, returning what we have 30529 1726882654.84059: results queue empty 30529 1726882654.84060: checking for any_errors_fatal 30529 1726882654.84063: done checking for any_errors_fatal 30529 1726882654.84064: checking for max_fail_percentage 30529 1726882654.84065: done checking for max_fail_percentage 30529 1726882654.84066: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.84067: done checking to see if all hosts have failed 30529 1726882654.84068: getting the remaining hosts for this loop 30529 1726882654.84069: done getting the remaining hosts for this loop 30529 
1726882654.84072: getting the next task for host managed_node1 30529 1726882654.84079: done getting next task for host managed_node1 30529 1726882654.84081: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882654.84086: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882654.84109: getting variables 30529 1726882654.84110: in VariableManager get_vars() 30529 1726882654.84141: Calling all_inventory to load vars for managed_node1 30529 1726882654.84143: Calling groups_inventory to load vars for managed_node1 30529 1726882654.84145: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.84153: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.84155: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.84158: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.85532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.87115: done with get_vars() 30529 1726882654.87134: done getting variables 30529 1726882654.87185: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:34 -0400 (0:00:00.049) 0:01:08.898 ****** 30529 1726882654.87226: entering _queue_task() for managed_node1/set_fact 30529 1726882654.87482: worker is 1 (out of 1 available) 30529 1726882654.87497: exiting _queue_task() for managed_node1/set_fact 30529 1726882654.87511: done queuing things up, now waiting for results queue to drain 30529 1726882654.87512: waiting for pending results... 
30529 1726882654.87913: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882654.87941: in run() - task 12673a56-9f93-b0f1-edc0-00000000189a 30529 1726882654.87964: variable 'ansible_search_path' from source: unknown 30529 1726882654.87972: variable 'ansible_search_path' from source: unknown 30529 1726882654.88018: calling self._execute() 30529 1726882654.88125: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.88137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.88150: variable 'omit' from source: magic vars 30529 1726882654.88531: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.88546: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.88712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882654.89010: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882654.89059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882654.89143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882654.89186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882654.89318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882654.89325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882654.89357: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882654.89388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882654.89494: variable '__network_is_ostree' from source: set_fact 30529 1726882654.89535: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882654.89538: when evaluation is False, skipping this task 30529 1726882654.89541: _execute() done 30529 1726882654.89543: dumping result to json 30529 1726882654.89545: done dumping result, returning 30529 1726882654.89548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-00000000189a] 30529 1726882654.89644: sending task result for task 12673a56-9f93-b0f1-edc0-00000000189a 30529 1726882654.89718: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000189a 30529 1726882654.89721: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882654.89799: no more pending results, returning what we have 30529 1726882654.89803: results queue empty 30529 1726882654.89805: checking for any_errors_fatal 30529 1726882654.89813: done checking for any_errors_fatal 30529 1726882654.89814: checking for max_fail_percentage 30529 1726882654.89816: done checking for max_fail_percentage 30529 1726882654.89817: checking to see if all hosts have failed and the running result is not ok 30529 1726882654.89818: done checking to see if all hosts have failed 30529 1726882654.89819: getting the remaining hosts for this loop 30529 1726882654.89820: done getting the remaining hosts for this loop 
30529 1726882654.89824: getting the next task for host managed_node1 30529 1726882654.89836: done getting next task for host managed_node1 30529 1726882654.89840: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882654.89848: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882654.89870: getting variables 30529 1726882654.89872: in VariableManager get_vars() 30529 1726882654.89917: Calling all_inventory to load vars for managed_node1 30529 1726882654.89920: Calling groups_inventory to load vars for managed_node1 30529 1726882654.89923: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882654.89934: Calling all_plugins_play to load vars for managed_node1 30529 1726882654.89938: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882654.89941: Calling groups_plugins_play to load vars for managed_node1 30529 1726882654.91515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882654.93347: done with get_vars() 30529 1726882654.93367: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:34 -0400 (0:00:00.062) 0:01:08.960 ****** 30529 1726882654.93464: entering _queue_task() for managed_node1/service_facts 30529 1726882654.93736: worker is 1 (out of 1 available) 30529 1726882654.93750: exiting _queue_task() for managed_node1/service_facts 30529 1726882654.93767: done queuing things up, now waiting for results queue to drain 30529 1726882654.93768: waiting for pending results... 
30529 1726882654.94096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882654.94439: in run() - task 12673a56-9f93-b0f1-edc0-00000000189c 30529 1726882654.94443: variable 'ansible_search_path' from source: unknown 30529 1726882654.94446: variable 'ansible_search_path' from source: unknown 30529 1726882654.94449: calling self._execute() 30529 1726882654.94509: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.94521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.94536: variable 'omit' from source: magic vars 30529 1726882654.95025: variable 'ansible_distribution_major_version' from source: facts 30529 1726882654.95041: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882654.95053: variable 'omit' from source: magic vars 30529 1726882654.95149: variable 'omit' from source: magic vars 30529 1726882654.95235: variable 'omit' from source: magic vars 30529 1726882654.95278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882654.95342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882654.95383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882654.95415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.95526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882654.95530: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882654.95533: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.95535: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882654.95709: Set connection var ansible_shell_executable to /bin/sh 30529 1726882654.95721: Set connection var ansible_pipelining to False 30529 1726882654.95729: Set connection var ansible_shell_type to sh 30529 1726882654.95758: Set connection var ansible_timeout to 10 30529 1726882654.95761: Set connection var ansible_connection to ssh 30529 1726882654.95852: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882654.95855: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.95857: variable 'ansible_connection' from source: unknown 30529 1726882654.95866: variable 'ansible_module_compression' from source: unknown 30529 1726882654.95869: variable 'ansible_shell_type' from source: unknown 30529 1726882654.95871: variable 'ansible_shell_executable' from source: unknown 30529 1726882654.95874: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882654.95876: variable 'ansible_pipelining' from source: unknown 30529 1726882654.95878: variable 'ansible_timeout' from source: unknown 30529 1726882654.95880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882654.96044: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882654.96058: variable 'omit' from source: magic vars 30529 1726882654.96074: starting attempt loop 30529 1726882654.96085: running the handler 30529 1726882654.96107: _low_level_execute_command(): starting 30529 1726882654.96119: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882654.96819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.96864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882654.96887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.96897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882654.96977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882654.98663: stdout chunk (state=3): >>>/root <<< 30529 1726882654.98834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882654.98838: stdout chunk (state=3): >>><<< 30529 1726882654.98841: stderr chunk (state=3): >>><<< 30529 1726882654.98971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882654.98975: _low_level_execute_command(): starting 30529 1726882654.98977: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954 `" && echo ansible-tmp-1726882654.9886348-33797-20454423321954="` echo /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954 `" ) && sleep 0' 30529 1726882654.99524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882654.99527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882654.99530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882654.99541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882654.99547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882654.99550: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882654.99560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.99563: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882654.99565: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882654.99567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882654.99617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882654.99621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882654.99624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882654.99626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882654.99629: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882654.99632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882654.99699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882654.99716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882654.99746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882654.99800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882655.01677: stdout chunk (state=3): >>>ansible-tmp-1726882654.9886348-33797-20454423321954=/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954 <<< 30529 1726882655.01808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882655.01838: stderr chunk (state=3): >>><<< 30529 1726882655.01860: stdout chunk (state=3): >>><<< 30529 1726882655.01873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882654.9886348-33797-20454423321954=/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882655.02098: variable 'ansible_module_compression' from source: unknown 30529 1726882655.02102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882655.02104: variable 'ansible_facts' from source: unknown 30529 1726882655.02112: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py 30529 1726882655.02235: Sending initial data 30529 1726882655.02339: Sent initial data (161 bytes) 30529 1726882655.02881: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882655.02902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882655.02919: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882655.03004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882655.03042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882655.03060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882655.03082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882655.03168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882655.04728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882655.04787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882655.04840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpbub6c4cy /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py <<< 30529 1726882655.04846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py" <<< 30529 1726882655.04891: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpbub6c4cy" to remote "/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py" <<< 30529 1726882655.05635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882655.05802: stderr chunk (state=3): >>><<< 30529 1726882655.05805: stdout chunk (state=3): >>><<< 30529 1726882655.05807: done transferring module to remote 30529 1726882655.05810: _low_level_execute_command(): starting 30529 1726882655.05812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/ /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py && sleep 0' 30529 1726882655.06362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882655.06376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882655.06390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882655.06414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882655.06432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882655.06443: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882655.06540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882655.06569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882655.06639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882655.08391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882655.08437: stderr chunk (state=3): >>><<< 30529 1726882655.08441: stdout chunk (state=3): >>><<< 30529 1726882655.08453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882655.08457: _low_level_execute_command(): starting 30529 1726882655.08460: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/AnsiballZ_service_facts.py && sleep 0' 30529 1726882655.08971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882655.08977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882655.08999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882655.09003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882655.09014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882655.09065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882655.09068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882655.09123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.60384: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": 
{"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882656.60409: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30529 1726882656.60429: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30529 1726882656.60455: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source<<< 30529 1726882656.60466: stdout chunk (state=3): >>>": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": 
"systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"<<< 30529 1726882656.60474: stdout chunk (state=3): >>>}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": 
{"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882656.61933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882656.61964: stderr chunk (state=3): >>><<< 30529 1726882656.61967: stdout chunk (state=3): >>><<< 30529 1726882656.61995: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882656.62462: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882656.62466: _low_level_execute_command(): starting 30529 1726882656.62471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882654.9886348-33797-20454423321954/ > /dev/null 2>&1 && sleep 0' 30529 1726882656.62930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882656.62934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.62936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882656.62938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.62987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882656.62996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882656.62998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.63041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.64789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882656.64818: stderr chunk (state=3): >>><<< 30529 1726882656.64822: stdout chunk (state=3): >>><<< 30529 1726882656.64833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30529 1726882656.64838: handler run complete 30529 1726882656.64952: variable 'ansible_facts' from source: unknown 30529 1726882656.65051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882656.65329: variable 'ansible_facts' from source: unknown 30529 1726882656.65409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882656.65522: attempt loop complete, returning result 30529 1726882656.65527: _execute() done 30529 1726882656.65530: dumping result to json 30529 1726882656.65563: done dumping result, returning 30529 1726882656.65573: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-00000000189c] 30529 1726882656.65576: sending task result for task 12673a56-9f93-b0f1-edc0-00000000189c ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882656.66208: no more pending results, returning what we have 30529 1726882656.66212: results queue empty 30529 1726882656.66213: checking for any_errors_fatal 30529 1726882656.66217: done checking for any_errors_fatal 30529 1726882656.66218: checking for max_fail_percentage 30529 1726882656.66219: done checking for max_fail_percentage 30529 1726882656.66220: checking to see if all hosts have failed and the running result is not ok 30529 1726882656.66221: done checking to see if all hosts have failed 30529 1726882656.66222: getting the remaining hosts for this loop 30529 1726882656.66223: done getting the remaining hosts for this loop 30529 1726882656.66226: getting the next task for host managed_node1 30529 1726882656.66233: done getting next task for host managed_node1 30529 1726882656.66235: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 30529 1726882656.66240: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882656.66250: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000189c 30529 1726882656.66253: WORKER PROCESS EXITING 30529 1726882656.66259: getting variables 30529 1726882656.66260: in VariableManager get_vars() 30529 1726882656.66284: Calling all_inventory to load vars for managed_node1 30529 1726882656.66286: Calling groups_inventory to load vars for managed_node1 30529 1726882656.66288: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882656.66298: Calling all_plugins_play to load vars for managed_node1 30529 1726882656.66300: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882656.66302: Calling groups_plugins_play to load vars for managed_node1 30529 1726882656.67048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882656.68001: done with get_vars() 30529 1726882656.68018: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:36 -0400 (0:00:01.746) 0:01:10.707 ****** 30529 1726882656.68088: entering _queue_task() for managed_node1/package_facts 30529 1726882656.68331: worker is 1 (out of 1 available) 30529 1726882656.68345: exiting _queue_task() for managed_node1/package_facts 30529 1726882656.68358: done queuing things up, now waiting for results queue to drain 30529 1726882656.68360: waiting for pending results... 
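The `_low_level_execute_command()` calls traced above follow a fixed remote temp-dir lifecycle: create a uniquely named directory under `~/.ansible/tmp` with a restrictive umask, echo its name back to the controller, run the transferred module from it, then remove it with `rm -f -r ... && sleep 0`. A minimal sketch of that lifecycle, with an assumed timestamp/PID suffix in place of Ansible's real random identifier:

```shell
# Sketch of the remote tmp-dir lifecycle _low_level_execute_command() drives
# over SSH (paths and the suffix format are assumptions for illustration).
set -eu
tmp_root="${HOME}/.ansible/tmp"            # same root shown in the log
stamp="$(date +%s).$$"                     # hypothetical suffix; Ansible also appends a random id
tmpdir="${tmp_root}/ansible-tmp-${stamp}"

# Step 1: create the directory, as in the log's mkdir one-liner
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmpdir" )

# Step 2: echo the name back so the controller learns the remote path
echo "ansible-tmp-${stamp}=${tmpdir}"

# Step 3: cleanup, exactly as the log's final command for the previous task
rm -f -r "$tmpdir" > /dev/null 2>&1 && sleep 0
```

The `&& sleep 0` tail mirrors the log verbatim; it forces the shell to report the exit status of the whole pipeline rather than terminating on the last builtin.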
30529 1726882656.68540: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882656.68636: in run() - task 12673a56-9f93-b0f1-edc0-00000000189d 30529 1726882656.68649: variable 'ansible_search_path' from source: unknown 30529 1726882656.68652: variable 'ansible_search_path' from source: unknown 30529 1726882656.68678: calling self._execute() 30529 1726882656.68754: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882656.68758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882656.68766: variable 'omit' from source: magic vars 30529 1726882656.69045: variable 'ansible_distribution_major_version' from source: facts 30529 1726882656.69055: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882656.69060: variable 'omit' from source: magic vars 30529 1726882656.69118: variable 'omit' from source: magic vars 30529 1726882656.69144: variable 'omit' from source: magic vars 30529 1726882656.69175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882656.69203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882656.69221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882656.69234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882656.69249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882656.69271: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882656.69274: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882656.69277: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882656.69355: Set connection var ansible_shell_executable to /bin/sh 30529 1726882656.69358: Set connection var ansible_pipelining to False 30529 1726882656.69361: Set connection var ansible_shell_type to sh 30529 1726882656.69366: Set connection var ansible_timeout to 10 30529 1726882656.69369: Set connection var ansible_connection to ssh 30529 1726882656.69374: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882656.69392: variable 'ansible_shell_executable' from source: unknown 30529 1726882656.69397: variable 'ansible_connection' from source: unknown 30529 1726882656.69400: variable 'ansible_module_compression' from source: unknown 30529 1726882656.69403: variable 'ansible_shell_type' from source: unknown 30529 1726882656.69405: variable 'ansible_shell_executable' from source: unknown 30529 1726882656.69408: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882656.69410: variable 'ansible_pipelining' from source: unknown 30529 1726882656.69412: variable 'ansible_timeout' from source: unknown 30529 1726882656.69414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882656.69552: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882656.69561: variable 'omit' from source: magic vars 30529 1726882656.69571: starting attempt loop 30529 1726882656.69574: running the handler 30529 1726882656.69583: _low_level_execute_command(): starting 30529 1726882656.69592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882656.70115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882656.70120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.70124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882656.70126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882656.70128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.70175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882656.70178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882656.70181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.70230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.71778: stdout chunk (state=3): >>>/root <<< 30529 1726882656.71882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882656.71912: stderr chunk (state=3): >>><<< 30529 1726882656.71916: stdout chunk (state=3): >>><<< 30529 1726882656.71936: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882656.71946: _low_level_execute_command(): starting 30529 1726882656.71952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959 `" && echo ansible-tmp-1726882656.7193398-33859-148909811715959="` echo /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959 `" ) && sleep 0' 30529 1726882656.72355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882656.72377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration <<< 30529 1726882656.72388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.72438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882656.72441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.72503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.74327: stdout chunk (state=3): >>>ansible-tmp-1726882656.7193398-33859-148909811715959=/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959 <<< 30529 1726882656.74436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882656.74462: stderr chunk (state=3): >>><<< 30529 1726882656.74465: stdout chunk (state=3): >>><<< 30529 1726882656.74476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882656.7193398-33859-148909811715959=/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882656.74515: variable 'ansible_module_compression' from source: unknown 30529 1726882656.74549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882656.74605: variable 'ansible_facts' from source: unknown 30529 1726882656.74721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py 30529 1726882656.74818: Sending initial data 30529 1726882656.74822: Sent initial data (162 bytes) 30529 1726882656.75250: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882656.75253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.75255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882656.75258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882656.75260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.75299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882656.75315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.75358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.76874: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882656.76920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882656.76971: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpr_h4mpk3 /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py <<< 30529 1726882656.76975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py" <<< 30529 1726882656.77020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpr_h4mpk3" to remote "/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py" <<< 30529 1726882656.78186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882656.78225: stderr chunk (state=3): >>><<< 30529 1726882656.78228: stdout chunk (state=3): >>><<< 30529 1726882656.78251: done transferring module to remote 30529 1726882656.78260: _low_level_execute_command(): starting 30529 1726882656.78263: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/ /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py && sleep 0' 30529 1726882656.78674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882656.78677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882656.78679: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.78681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882656.78683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882656.78741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882656.78744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.78780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882656.80530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882656.80534: stdout chunk (state=3): >>><<< 30529 1726882656.80536: stderr chunk (state=3): >>><<< 30529 1726882656.80599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882656.80602: _low_level_execute_command(): starting 30529 1726882656.80605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/AnsiballZ_package_facts.py && sleep 0' 30529 1726882656.81228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882656.81249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882656.81321: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882657.25049: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30529 1726882657.25135: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30529 1726882657.25244: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": 
"3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": 
"14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], 
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30529 1726882657.25316: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": 
[{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882657.27435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882657.27439: stdout chunk (state=3): >>><<< 30529 1726882657.27441: stderr chunk (state=3): >>><<< 30529 1726882657.27577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882657.32010: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882657.32110: _low_level_execute_command(): starting 30529 1726882657.32114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882656.7193398-33859-148909811715959/ > /dev/null 2>&1 && sleep 0' 30529 1726882657.32622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882657.32637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882657.32653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882657.32669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882657.32685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882657.32699: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882657.32804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882657.32820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882657.32839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882657.32909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882657.35099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882657.35102: stdout chunk (state=3): >>><<< 30529 1726882657.35105: stderr chunk (state=3): >>><<< 30529 1726882657.35107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882657.35110: handler run complete 30529 1726882657.35937: variable 'ansible_facts' from source: unknown 30529 1726882657.36428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.39351: variable 'ansible_facts' from source: unknown 30529 1726882657.39783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.40442: attempt loop complete, returning result 30529 1726882657.40459: _execute() done 30529 1726882657.40467: dumping result to json 30529 1726882657.40686: done dumping result, returning 30529 1726882657.40702: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-00000000189d] 30529 1726882657.40717: sending task result for task 12673a56-9f93-b0f1-edc0-00000000189d 30529 1726882657.44069: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000189d 30529 1726882657.44072: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882657.44225: no more pending results, returning what we have 30529 1726882657.44228: results queue empty 30529 1726882657.44229: checking for any_errors_fatal 30529 1726882657.44234: done checking for any_errors_fatal 30529 1726882657.44234: checking for max_fail_percentage 30529 1726882657.44236: done checking for max_fail_percentage 30529 1726882657.44237: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.44238: done checking to see if all hosts have failed 30529 1726882657.44238: getting the remaining hosts for this loop 30529 1726882657.44240: done getting the remaining 
hosts for this loop 30529 1726882657.44243: getting the next task for host managed_node1 30529 1726882657.44251: done getting next task for host managed_node1 30529 1726882657.44255: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882657.44260: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882657.44271: getting variables 30529 1726882657.44273: in VariableManager get_vars() 30529 1726882657.44413: Calling all_inventory to load vars for managed_node1 30529 1726882657.44417: Calling groups_inventory to load vars for managed_node1 30529 1726882657.44419: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.44429: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.44431: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.44434: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.45724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.48385: done with get_vars() 30529 1726882657.48422: done getting variables 30529 1726882657.48492: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:37 -0400 (0:00:00.804) 0:01:11.511 ****** 30529 1726882657.48536: entering _queue_task() for managed_node1/debug 30529 1726882657.48898: worker is 1 (out of 1 available) 30529 1726882657.48912: exiting _queue_task() for managed_node1/debug 30529 1726882657.48924: done queuing things up, now waiting for results queue to drain 30529 1726882657.48925: waiting for pending results... 
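The conditional bookkeeping that recurs throughout this trace ("Evaluated conditional (ansible_distribution_major_version != '6'): True", and later "when evaluation is False, skipping this task" with a `false_condition` in the result) follows a simple first-falsy-wins pattern. A minimal Python sketch of that skip logic (a hypothetical helper for illustration, not Ansible's actual TaskExecutor code, which evaluates conditions through Jinja2 rather than `eval`):

```python
def evaluate_when(conditions, variables):
    """Evaluate a task's `when` list the way the log reports it:
    every condition must be truthy, and the first falsy one is
    recorded as `false_condition` in the skip result."""
    for cond in conditions:
        # Ansible templates each condition with Jinja2; eval() stands in here.
        result = bool(eval(cond, {}, variables))
        if not result:
            return {"changed": False,
                    "false_condition": cond,
                    "skip_reason": "Conditional result was False"}
    return None  # no falsy condition: the task runs

# Mirrors the trace: the distribution check passes, then the
# network_state check fails and the task is skipped.
variables = {"ansible_distribution_major_version": "9", "network_state": {}}
skip = evaluate_when(["ansible_distribution_major_version != '6'",
                      "network_state != {}"], variables)
```

With these (assumed) values, `skip` matches the shape of the `skipping: [managed_node1] => {...}` results shown in this log.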
30529 1726882657.49304: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882657.49376: in run() - task 12673a56-9f93-b0f1-edc0-00000000183b 30529 1726882657.49406: variable 'ansible_search_path' from source: unknown 30529 1726882657.49413: variable 'ansible_search_path' from source: unknown 30529 1726882657.49452: calling self._execute() 30529 1726882657.49555: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.49565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.49577: variable 'omit' from source: magic vars 30529 1726882657.49972: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.49988: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.50053: variable 'omit' from source: magic vars 30529 1726882657.50085: variable 'omit' from source: magic vars 30529 1726882657.50191: variable 'network_provider' from source: set_fact 30529 1726882657.50218: variable 'omit' from source: magic vars 30529 1726882657.50268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882657.50315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882657.50339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882657.50360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882657.50490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882657.50495: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882657.50497: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882657.50500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.50548: Set connection var ansible_shell_executable to /bin/sh 30529 1726882657.50558: Set connection var ansible_pipelining to False 30529 1726882657.50565: Set connection var ansible_shell_type to sh 30529 1726882657.50579: Set connection var ansible_timeout to 10 30529 1726882657.50586: Set connection var ansible_connection to ssh 30529 1726882657.50605: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882657.50631: variable 'ansible_shell_executable' from source: unknown 30529 1726882657.50639: variable 'ansible_connection' from source: unknown 30529 1726882657.50647: variable 'ansible_module_compression' from source: unknown 30529 1726882657.50653: variable 'ansible_shell_type' from source: unknown 30529 1726882657.50660: variable 'ansible_shell_executable' from source: unknown 30529 1726882657.50666: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.50674: variable 'ansible_pipelining' from source: unknown 30529 1726882657.50680: variable 'ansible_timeout' from source: unknown 30529 1726882657.50688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.50831: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882657.50847: variable 'omit' from source: magic vars 30529 1726882657.50854: starting attempt loop 30529 1726882657.50860: running the handler 30529 1726882657.50906: handler run complete 30529 1726882657.50930: attempt loop complete, returning result 30529 1726882657.50936: _execute() done 30529 1726882657.50940: dumping result to json 30529 1726882657.50998: done dumping result, returning 
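The "Set connection var …" and "variable '…' from source: host vars / role '' defaults / unknown" entries above reflect per-host options being resolved from layered variable sources, first defining layer wins. A hedged sketch of that lookup (the layer names and helper are illustrative, not Ansible's internal VariableManager API):

```python
def resolve_var(name, sources):
    """Return (value, source_label) from the first layer that defines
    `name`, mirroring the "variable '...' from source: ..." log lines.
    Layers earlier in `sources` have higher precedence."""
    for label, layer in sources:
        if name in layer:
            return layer[name], label
    return None, "unknown"  # the log reports undefined layers as source: unknown

# Example layers loosely modelled on this trace (values are assumptions).
sources = [
    ("host vars for 'managed_node1'", {"ansible_host": "10.0.0.1"}),
    ("role '' defaults", {"ansible_timeout": 10}),
]
value, src = resolve_var("ansible_host", sources)
```

Under this model, `ansible_host` resolves from host vars while an unlisted name falls through to "unknown", matching how the trace labels each lookup.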
30529 1726882657.51002: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-00000000183b] 30529 1726882657.51004: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183b ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882657.51212: no more pending results, returning what we have 30529 1726882657.51217: results queue empty 30529 1726882657.51218: checking for any_errors_fatal 30529 1726882657.51228: done checking for any_errors_fatal 30529 1726882657.51229: checking for max_fail_percentage 30529 1726882657.51231: done checking for max_fail_percentage 30529 1726882657.51232: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.51233: done checking to see if all hosts have failed 30529 1726882657.51234: getting the remaining hosts for this loop 30529 1726882657.51236: done getting the remaining hosts for this loop 30529 1726882657.51240: getting the next task for host managed_node1 30529 1726882657.51252: done getting next task for host managed_node1 30529 1726882657.51256: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882657.51263: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.51278: getting variables 30529 1726882657.51280: in VariableManager get_vars() 30529 1726882657.51552: Calling all_inventory to load vars for managed_node1 30529 1726882657.51555: Calling groups_inventory to load vars for managed_node1 30529 1726882657.51557: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.51568: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.51571: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.51575: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.52237: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000183b 30529 1726882657.52240: WORKER PROCESS EXITING 30529 1726882657.53260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.54881: done with get_vars() 30529 1726882657.54911: done getting variables 30529 1726882657.54969: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:37 -0400 (0:00:00.064) 0:01:11.576 ****** 30529 1726882657.55016: entering _queue_task() for managed_node1/fail 30529 1726882657.55379: worker is 1 (out of 1 available) 30529 1726882657.55600: exiting _queue_task() for managed_node1/fail 30529 1726882657.55613: done queuing things up, now waiting for results queue to drain 30529 1726882657.55615: waiting for pending results... 30529 1726882657.55733: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882657.55917: in run() - task 12673a56-9f93-b0f1-edc0-00000000183c 30529 1726882657.55935: variable 'ansible_search_path' from source: unknown 30529 1726882657.55942: variable 'ansible_search_path' from source: unknown 30529 1726882657.55984: calling self._execute() 30529 1726882657.56085: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.56101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.56114: variable 'omit' from source: magic vars 30529 1726882657.56542: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.56561: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.56702: variable 'network_state' from source: role '' defaults 30529 1726882657.56721: Evaluated conditional (network_state != {}): False 30529 1726882657.56733: when evaluation is False, skipping this task 30529 1726882657.56741: _execute() done 30529 1726882657.56749: dumping result to json 30529 1726882657.56755: done dumping result, returning 30529 1726882657.56767: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-00000000183c] 30529 1726882657.56778: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882657.57047: no more pending results, returning what we have 30529 1726882657.57052: results queue empty 30529 1726882657.57053: checking for any_errors_fatal 30529 1726882657.57061: done checking for any_errors_fatal 30529 1726882657.57062: checking for max_fail_percentage 30529 1726882657.57064: done checking for max_fail_percentage 30529 1726882657.57065: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.57066: done checking to see if all hosts have failed 30529 1726882657.57067: getting the remaining hosts for this loop 30529 1726882657.57069: done getting the remaining hosts for this loop 30529 1726882657.57073: getting the next task for host managed_node1 30529 1726882657.57083: done getting next task for host managed_node1 30529 1726882657.57087: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882657.57098: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.57125: getting variables 30529 1726882657.57127: in VariableManager get_vars() 30529 1726882657.57176: Calling all_inventory to load vars for managed_node1 30529 1726882657.57179: Calling groups_inventory to load vars for managed_node1 30529 1726882657.57182: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.57402: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.57406: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.57410: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.58106: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000183c 30529 1726882657.58110: WORKER PROCESS EXITING 30529 1726882657.58837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.60601: done with get_vars() 30529 1726882657.60624: done getting variables 30529 1726882657.60683: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:37 -0400 (0:00:00.057) 0:01:11.633 ****** 30529 1726882657.60723: entering _queue_task() for managed_node1/fail 30529 1726882657.61300: worker is 1 (out of 1 available) 30529 1726882657.61310: exiting _queue_task() for managed_node1/fail 30529 1726882657.61320: done queuing things up, now waiting for results queue to drain 30529 1726882657.61322: waiting for pending results... 30529 1726882657.61433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882657.61599: in run() - task 12673a56-9f93-b0f1-edc0-00000000183d 30529 1726882657.61621: variable 'ansible_search_path' from source: unknown 30529 1726882657.61630: variable 'ansible_search_path' from source: unknown 30529 1726882657.61673: calling self._execute() 30529 1726882657.61777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.61788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.61808: variable 'omit' from source: magic vars 30529 1726882657.62184: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.62209: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.62338: variable 'network_state' from source: role '' defaults 30529 1726882657.62353: Evaluated conditional (network_state != {}): False 30529 1726882657.62361: when evaluation is False, skipping this task 30529 1726882657.62369: _execute() done 30529 1726882657.62376: dumping result to json 30529 1726882657.62383: done dumping result, returning 30529 1726882657.62400: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-00000000183d] 30529 1726882657.62410: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882657.62572: no more pending results, returning what we have 30529 1726882657.62576: results queue empty 30529 1726882657.62577: checking for any_errors_fatal 30529 1726882657.62587: done checking for any_errors_fatal 30529 1726882657.62587: checking for max_fail_percentage 30529 1726882657.62592: done checking for max_fail_percentage 30529 1726882657.62595: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.62596: done checking to see if all hosts have failed 30529 1726882657.62597: getting the remaining hosts for this loop 30529 1726882657.62599: done getting the remaining hosts for this loop 30529 1726882657.62603: getting the next task for host managed_node1 30529 1726882657.62611: done getting next task for host managed_node1 30529 1726882657.62616: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882657.62622: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.62646: getting variables 30529 1726882657.62649: in VariableManager get_vars() 30529 1726882657.62797: Calling all_inventory to load vars for managed_node1 30529 1726882657.62801: Calling groups_inventory to load vars for managed_node1 30529 1726882657.62804: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.62810: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000183d 30529 1726882657.62813: WORKER PROCESS EXITING 30529 1726882657.62827: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.62830: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.62833: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.64377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.65969: done with get_vars() 30529 1726882657.65995: done getting variables 30529 1726882657.66054: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:37 -0400 (0:00:00.053) 0:01:11.686 ****** 30529 1726882657.66091: entering _queue_task() for managed_node1/fail 30529 1726882657.66517: worker is 1 (out of 1 available) 30529 1726882657.66528: exiting _queue_task() for managed_node1/fail 30529 1726882657.66540: done queuing things up, now waiting for results queue to drain 30529 1726882657.66541: waiting for pending results... 30529 1726882657.66750: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882657.66912: in run() - task 12673a56-9f93-b0f1-edc0-00000000183e 30529 1726882657.66931: variable 'ansible_search_path' from source: unknown 30529 1726882657.66938: variable 'ansible_search_path' from source: unknown 30529 1726882657.66977: calling self._execute() 30529 1726882657.67072: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.67082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.67106: variable 'omit' from source: magic vars 30529 1726882657.67473: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.67698: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.67701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882657.69955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882657.70030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882657.70073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882657.70120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882657.70154: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882657.70248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.70663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.70700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.70746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.70772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.70880: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.70907: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882657.71028: variable 'ansible_distribution' from source: facts 30529 1726882657.71038: variable '__network_rh_distros' from source: role '' defaults 30529 1726882657.71047: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882657.71214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.71233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.71251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.71277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.71289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.71326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.71343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.71361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.71386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.71403: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.71431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.71447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.71465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.71491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.71506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.71690: variable 'network_connections' from source: include params 30529 1726882657.71703: variable 'interface' from source: play vars 30529 1726882657.71750: variable 'interface' from source: play vars 30529 1726882657.71759: variable 'network_state' from source: role '' defaults 30529 1726882657.71817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882657.71929: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882657.71956: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 
1726882657.71978: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882657.72004: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882657.72035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882657.72053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882657.72074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.72092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882657.72122: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882657.72125: when evaluation is False, skipping this task 30529 1726882657.72128: _execute() done 30529 1726882657.72131: dumping result to json 30529 1726882657.72133: done dumping result, returning 30529 1726882657.72138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-00000000183e] 30529 1726882657.72142: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183e 30529 1726882657.72231: done sending task 
result for task 12673a56-9f93-b0f1-edc0-00000000183e 30529 1726882657.72233: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882657.72283: no more pending results, returning what we have 30529 1726882657.72287: results queue empty 30529 1726882657.72288: checking for any_errors_fatal 30529 1726882657.72294: done checking for any_errors_fatal 30529 1726882657.72295: checking for max_fail_percentage 30529 1726882657.72297: done checking for max_fail_percentage 30529 1726882657.72298: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.72298: done checking to see if all hosts have failed 30529 1726882657.72299: getting the remaining hosts for this loop 30529 1726882657.72301: done getting the remaining hosts for this loop 30529 1726882657.72305: getting the next task for host managed_node1 30529 1726882657.72312: done getting next task for host managed_node1 30529 1726882657.72316: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882657.72321: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.72340: getting variables 30529 1726882657.72342: in VariableManager get_vars() 30529 1726882657.72382: Calling all_inventory to load vars for managed_node1 30529 1726882657.72385: Calling groups_inventory to load vars for managed_node1 30529 1726882657.72387: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.72403: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.72406: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.72409: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.73821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.74683: done with get_vars() 30529 1726882657.74704: done getting variables 30529 1726882657.74749: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:37 -0400 (0:00:00.086) 0:01:11.773 ****** 30529 1726882657.74772: entering _queue_task() for managed_node1/dnf 30529 1726882657.75021: worker is 1 (out of 1 available) 30529 1726882657.75035: exiting _queue_task() for managed_node1/dnf 30529 1726882657.75048: done queuing things up, now waiting for results queue to drain 30529 1726882657.75050: waiting for pending results... 30529 1726882657.75233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882657.75333: in run() - task 12673a56-9f93-b0f1-edc0-00000000183f 30529 1726882657.75344: variable 'ansible_search_path' from source: unknown 30529 1726882657.75348: variable 'ansible_search_path' from source: unknown 30529 1726882657.75377: calling self._execute() 30529 1726882657.75450: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.75454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.75463: variable 'omit' from source: magic vars 30529 1726882657.75799: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.75802: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.76003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882657.77861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882657.77912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882657.77939: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882657.77965: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882657.77988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882657.78049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.78078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.78104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.78130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.78141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.78227: variable 'ansible_distribution' from source: facts 30529 1726882657.78231: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.78244: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882657.78325: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882657.78409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.78430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.78447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.78471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.78481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.78513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.78530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.78547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.78571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.78581: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.78654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.78658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.78672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.78714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.78727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.78896: variable 'network_connections' from source: include params 30529 1726882657.79098: variable 'interface' from source: play vars 30529 1726882657.79102: variable 'interface' from source: play vars 30529 1726882657.79104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882657.79240: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882657.79284: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882657.79330: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882657.79366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882657.79403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882657.79436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882657.79456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.79489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882657.79650: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882657.79799: variable 'network_connections' from source: include params 30529 1726882657.79810: variable 'interface' from source: play vars 30529 1726882657.79875: variable 'interface' from source: play vars 30529 1726882657.79918: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882657.79927: when evaluation is False, skipping this task 30529 1726882657.79935: _execute() done 30529 1726882657.79941: dumping result to json 30529 1726882657.79947: done dumping result, returning 30529 1726882657.79960: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000183f] 30529 
1726882657.79971: sending task result for task 12673a56-9f93-b0f1-edc0-00000000183f skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882657.80138: no more pending results, returning what we have 30529 1726882657.80142: results queue empty 30529 1726882657.80143: checking for any_errors_fatal 30529 1726882657.80152: done checking for any_errors_fatal 30529 1726882657.80152: checking for max_fail_percentage 30529 1726882657.80154: done checking for max_fail_percentage 30529 1726882657.80155: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.80156: done checking to see if all hosts have failed 30529 1726882657.80157: getting the remaining hosts for this loop 30529 1726882657.80159: done getting the remaining hosts for this loop 30529 1726882657.80163: getting the next task for host managed_node1 30529 1726882657.80172: done getting next task for host managed_node1 30529 1726882657.80177: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882657.80182: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.80208: getting variables 30529 1726882657.80210: in VariableManager get_vars() 30529 1726882657.80253: Calling all_inventory to load vars for managed_node1 30529 1726882657.80256: Calling groups_inventory to load vars for managed_node1 30529 1726882657.80258: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.80268: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.80271: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.80274: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.81412: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000183f 30529 1726882657.81416: WORKER PROCESS EXITING 30529 1726882657.81426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.82299: done with get_vars() 30529 1726882657.82316: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882657.82371: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:37 -0400 (0:00:00.076) 0:01:11.850 ****** 30529 1726882657.82424: entering _queue_task() for managed_node1/yum 30529 1726882657.82770: worker is 1 (out of 1 available) 30529 1726882657.82784: exiting _queue_task() for managed_node1/yum 30529 1726882657.82906: done queuing things up, now waiting for results queue to drain 30529 1726882657.82908: waiting for pending results... 30529 1726882657.83158: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882657.83313: in run() - task 12673a56-9f93-b0f1-edc0-000000001840 30529 1726882657.83341: variable 'ansible_search_path' from source: unknown 30529 1726882657.83345: variable 'ansible_search_path' from source: unknown 30529 1726882657.83375: calling self._execute() 30529 1726882657.83450: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.83454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.83462: variable 'omit' from source: magic vars 30529 1726882657.83752: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.83761: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.83880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882657.85653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882657.85697: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882657.85722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882657.85747: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882657.85768: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882657.85827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.85847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.85868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.85896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.85909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.85975: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.85987: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882657.85994: when evaluation is False, skipping this task 30529 1726882657.85999: _execute() done 30529 1726882657.86002: dumping result to json 30529 1726882657.86004: done dumping result, 
returning 30529 1726882657.86007: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001840] 30529 1726882657.86012: sending task result for task 12673a56-9f93-b0f1-edc0-000000001840 30529 1726882657.86105: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001840 30529 1726882657.86108: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882657.86158: no more pending results, returning what we have 30529 1726882657.86162: results queue empty 30529 1726882657.86163: checking for any_errors_fatal 30529 1726882657.86169: done checking for any_errors_fatal 30529 1726882657.86170: checking for max_fail_percentage 30529 1726882657.86171: done checking for max_fail_percentage 30529 1726882657.86172: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.86173: done checking to see if all hosts have failed 30529 1726882657.86174: getting the remaining hosts for this loop 30529 1726882657.86175: done getting the remaining hosts for this loop 30529 1726882657.86179: getting the next task for host managed_node1 30529 1726882657.86187: done getting next task for host managed_node1 30529 1726882657.86194: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882657.86200: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882657.86223: getting variables 30529 1726882657.86224: in VariableManager get_vars() 30529 1726882657.86264: Calling all_inventory to load vars for managed_node1 30529 1726882657.86266: Calling groups_inventory to load vars for managed_node1 30529 1726882657.86268: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.86277: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.86279: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.86284: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.87215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.88080: done with get_vars() 30529 1726882657.88099: done getting variables 30529 1726882657.88140: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:37 -0400 (0:00:00.057) 0:01:11.907 ****** 30529 1726882657.88166: entering _queue_task() for managed_node1/fail 30529 1726882657.88406: worker is 1 (out of 1 available) 30529 1726882657.88422: exiting _queue_task() for managed_node1/fail 30529 1726882657.88435: done queuing things up, now waiting for results queue to drain 30529 1726882657.88437: waiting for pending results... 30529 1726882657.88620: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882657.88726: in run() - task 12673a56-9f93-b0f1-edc0-000000001841 30529 1726882657.88737: variable 'ansible_search_path' from source: unknown 30529 1726882657.88741: variable 'ansible_search_path' from source: unknown 30529 1726882657.88768: calling self._execute() 30529 1726882657.88848: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.88852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.88860: variable 'omit' from source: magic vars 30529 1726882657.89142: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.89152: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.89240: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882657.89369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882657.90891: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882657.90940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882657.90969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882657.91000: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882657.91022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882657.91083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.91122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.91136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.91164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.91174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.91210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 
1726882657.91229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.91246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.91272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.91282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.91315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.91337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.91351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.91376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.91386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882657.91503: variable 'network_connections' from source: include params 30529 1726882657.91512: variable 'interface' from source: play vars 30529 1726882657.91559: variable 'interface' from source: play vars 30529 1726882657.91617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882657.91725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882657.91752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882657.91777: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882657.91800: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882657.91832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882657.91847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882657.91865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.91884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882657.91932: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882657.92084: variable 'network_connections' from source: include params 30529 1726882657.92087: variable 'interface' from source: play 
vars 30529 1726882657.92133: variable 'interface' from source: play vars 30529 1726882657.92157: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882657.92161: when evaluation is False, skipping this task 30529 1726882657.92163: _execute() done 30529 1726882657.92166: dumping result to json 30529 1726882657.92168: done dumping result, returning 30529 1726882657.92174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001841] 30529 1726882657.92179: sending task result for task 12673a56-9f93-b0f1-edc0-000000001841 30529 1726882657.92270: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001841 30529 1726882657.92273: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882657.92326: no more pending results, returning what we have 30529 1726882657.92330: results queue empty 30529 1726882657.92331: checking for any_errors_fatal 30529 1726882657.92338: done checking for any_errors_fatal 30529 1726882657.92339: checking for max_fail_percentage 30529 1726882657.92341: done checking for max_fail_percentage 30529 1726882657.92341: checking to see if all hosts have failed and the running result is not ok 30529 1726882657.92342: done checking to see if all hosts have failed 30529 1726882657.92343: getting the remaining hosts for this loop 30529 1726882657.92345: done getting the remaining hosts for this loop 30529 1726882657.92349: getting the next task for host managed_node1 30529 1726882657.92357: done getting next task for host managed_node1 30529 1726882657.92361: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882657.92366: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882657.92386: getting variables 30529 1726882657.92387: in VariableManager get_vars() 30529 1726882657.92433: Calling all_inventory to load vars for managed_node1 30529 1726882657.92436: Calling groups_inventory to load vars for managed_node1 30529 1726882657.92438: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882657.92447: Calling all_plugins_play to load vars for managed_node1 30529 1726882657.92449: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882657.92451: Calling groups_plugins_play to load vars for managed_node1 30529 1726882657.93270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882657.94244: done with get_vars() 30529 1726882657.94260: done getting variables 30529 1726882657.94306: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:37 -0400 (0:00:00.061) 0:01:11.969 ****** 30529 1726882657.94331: entering _queue_task() for managed_node1/package 30529 1726882657.94566: worker is 1 (out of 1 available) 30529 1726882657.94580: exiting _queue_task() for managed_node1/package 30529 1726882657.94597: done queuing things up, now waiting for results queue to drain 30529 1726882657.94599: waiting for pending results... 
30529 1726882657.94773: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882657.94868: in run() - task 12673a56-9f93-b0f1-edc0-000000001842 30529 1726882657.94879: variable 'ansible_search_path' from source: unknown 30529 1726882657.94884: variable 'ansible_search_path' from source: unknown 30529 1726882657.94916: calling self._execute() 30529 1726882657.94991: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882657.95001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882657.95009: variable 'omit' from source: magic vars 30529 1726882657.95287: variable 'ansible_distribution_major_version' from source: facts 30529 1726882657.95301: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882657.95431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882657.95619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882657.95650: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882657.95675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882657.95729: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882657.95813: variable 'network_packages' from source: role '' defaults 30529 1726882657.95877: variable '__network_provider_setup' from source: role '' defaults 30529 1726882657.95885: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882657.95936: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882657.95944: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882657.95985: variable 
'__network_packages_default_nm' from source: role '' defaults 30529 1726882657.96106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882657.97446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882657.97485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882657.97516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882657.97540: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882657.97564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882657.97625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.97645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.97665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.97691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.97706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882657.97737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.97753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.97773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.97802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.97813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.97951: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882657.98026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.98042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.98059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.98083: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.98101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.98159: variable 'ansible_python' from source: facts 30529 1726882657.98172: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882657.98232: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882657.98284: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882657.98369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.98386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.98407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.98437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.98447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.98478: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882657.98501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882657.98520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.98547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882657.98557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882657.98654: variable 'network_connections' from source: include params 30529 1726882657.98659: variable 'interface' from source: play vars 30529 1726882657.98730: variable 'interface' from source: play vars 30529 1726882657.98792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882657.98815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882657.98836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882657.98998: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882657.99001: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882657.99109: variable 'network_connections' from source: include params 30529 1726882657.99112: variable 'interface' from source: play vars 30529 1726882657.99178: variable 'interface' from source: play vars 30529 1726882657.99222: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882657.99274: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882657.99472: variable 'network_connections' from source: include params 30529 1726882657.99475: variable 'interface' from source: play vars 30529 1726882657.99523: variable 'interface' from source: play vars 30529 1726882657.99544: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882657.99597: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882657.99789: variable 'network_connections' from source: include params 30529 1726882657.99796: variable 'interface' from source: play vars 30529 1726882657.99840: variable 'interface' from source: play vars 30529 1726882657.99885: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882657.99929: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882657.99935: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882657.99977: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882658.00114: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882658.00597: variable 'network_connections' from source: include params 30529 1726882658.00600: variable 'interface' from 
source: play vars 30529 1726882658.00603: variable 'interface' from source: play vars 30529 1726882658.00605: variable 'ansible_distribution' from source: facts 30529 1726882658.00607: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.00609: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.00611: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882658.00708: variable 'ansible_distribution' from source: facts 30529 1726882658.00716: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.00726: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.00740: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882658.00915: variable 'ansible_distribution' from source: facts 30529 1726882658.00923: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.00932: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.00970: variable 'network_provider' from source: set_fact 30529 1726882658.00999: variable 'ansible_facts' from source: unknown 30529 1726882658.01653: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882658.01666: when evaluation is False, skipping this task 30529 1726882658.01669: _execute() done 30529 1726882658.01672: dumping result to json 30529 1726882658.01674: done dumping result, returning 30529 1726882658.01676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000001842] 30529 1726882658.01678: sending task result for task 12673a56-9f93-b0f1-edc0-000000001842 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 
1726882658.01823: no more pending results, returning what we have 30529 1726882658.01826: results queue empty 30529 1726882658.01827: checking for any_errors_fatal 30529 1726882658.01836: done checking for any_errors_fatal 30529 1726882658.01837: checking for max_fail_percentage 30529 1726882658.01838: done checking for max_fail_percentage 30529 1726882658.01839: checking to see if all hosts have failed and the running result is not ok 30529 1726882658.01840: done checking to see if all hosts have failed 30529 1726882658.01841: getting the remaining hosts for this loop 30529 1726882658.01843: done getting the remaining hosts for this loop 30529 1726882658.01846: getting the next task for host managed_node1 30529 1726882658.01854: done getting next task for host managed_node1 30529 1726882658.01859: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882658.01863: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882658.01889: getting variables 30529 1726882658.01890: in VariableManager get_vars() 30529 1726882658.01931: Calling all_inventory to load vars for managed_node1 30529 1726882658.01934: Calling groups_inventory to load vars for managed_node1 30529 1726882658.01941: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882658.01951: Calling all_plugins_play to load vars for managed_node1 30529 1726882658.01954: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882658.01956: Calling groups_plugins_play to load vars for managed_node1 30529 1726882658.02506: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001842 30529 1726882658.02510: WORKER PROCESS EXITING 30529 1726882658.02811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882658.04098: done with get_vars() 30529 1726882658.04123: done getting variables 30529 1726882658.04180: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:38 -0400 (0:00:00.098) 0:01:12.068 ****** 30529 1726882658.04217: entering _queue_task() for managed_node1/package 30529 1726882658.04567: worker is 1 (out of 1 available) 30529 1726882658.04579: exiting _queue_task() for managed_node1/package 30529 1726882658.04795: done queuing things up, now waiting for results queue to drain 30529 
1726882658.04797: waiting for pending results... 30529 1726882658.04928: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882658.05048: in run() - task 12673a56-9f93-b0f1-edc0-000000001843 30529 1726882658.05071: variable 'ansible_search_path' from source: unknown 30529 1726882658.05131: variable 'ansible_search_path' from source: unknown 30529 1726882658.05135: calling self._execute() 30529 1726882658.05225: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.05243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.05258: variable 'omit' from source: magic vars 30529 1726882658.05640: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.05658: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882658.05792: variable 'network_state' from source: role '' defaults 30529 1726882658.05810: Evaluated conditional (network_state != {}): False 30529 1726882658.05817: when evaluation is False, skipping this task 30529 1726882658.05891: _execute() done 30529 1726882658.05896: dumping result to json 30529 1726882658.05899: done dumping result, returning 30529 1726882658.05902: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001843] 30529 1726882658.05904: sending task result for task 12673a56-9f93-b0f1-edc0-000000001843 30529 1726882658.05975: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001843 30529 1726882658.05977: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882658.06043: no more pending results, returning what we have 30529 1726882658.06047: 
results queue empty 30529 1726882658.06049: checking for any_errors_fatal 30529 1726882658.06056: done checking for any_errors_fatal 30529 1726882658.06057: checking for max_fail_percentage 30529 1726882658.06059: done checking for max_fail_percentage 30529 1726882658.06060: checking to see if all hosts have failed and the running result is not ok 30529 1726882658.06061: done checking to see if all hosts have failed 30529 1726882658.06062: getting the remaining hosts for this loop 30529 1726882658.06063: done getting the remaining hosts for this loop 30529 1726882658.06067: getting the next task for host managed_node1 30529 1726882658.06077: done getting next task for host managed_node1 30529 1726882658.06081: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882658.06086: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882658.06112: getting variables 30529 1726882658.06114: in VariableManager get_vars() 30529 1726882658.06158: Calling all_inventory to load vars for managed_node1 30529 1726882658.06161: Calling groups_inventory to load vars for managed_node1 30529 1726882658.06163: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882658.06176: Calling all_plugins_play to load vars for managed_node1 30529 1726882658.06179: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882658.06181: Calling groups_plugins_play to load vars for managed_node1 30529 1726882658.07953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882658.09528: done with get_vars() 30529 1726882658.09556: done getting variables 30529 1726882658.09621: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:38 -0400 (0:00:00.054) 0:01:12.122 ****** 30529 1726882658.09660: entering _queue_task() for managed_node1/package 30529 1726882658.10206: worker is 1 (out of 1 available) 30529 1726882658.10218: exiting _queue_task() for managed_node1/package 30529 1726882658.10232: done queuing things up, now waiting for results queue to drain 30529 1726882658.10233: waiting for pending results... 
30529 1726882658.10614: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882658.10676: in run() - task 12673a56-9f93-b0f1-edc0-000000001844 30529 1726882658.10698: variable 'ansible_search_path' from source: unknown 30529 1726882658.10712: variable 'ansible_search_path' from source: unknown 30529 1726882658.10754: calling self._execute() 30529 1726882658.10860: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.10871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.10885: variable 'omit' from source: magic vars 30529 1726882658.11302: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.11313: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882658.11408: variable 'network_state' from source: role '' defaults 30529 1726882658.11415: Evaluated conditional (network_state != {}): False 30529 1726882658.11418: when evaluation is False, skipping this task 30529 1726882658.11421: _execute() done 30529 1726882658.11424: dumping result to json 30529 1726882658.11426: done dumping result, returning 30529 1726882658.11434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001844] 30529 1726882658.11439: sending task result for task 12673a56-9f93-b0f1-edc0-000000001844 30529 1726882658.11532: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001844 30529 1726882658.11535: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882658.11579: no more pending results, returning what we have 30529 1726882658.11582: results queue empty 30529 1726882658.11583: checking for 
any_errors_fatal 30529 1726882658.11590: done checking for any_errors_fatal 30529 1726882658.11591: checking for max_fail_percentage 30529 1726882658.11592: done checking for max_fail_percentage 30529 1726882658.11595: checking to see if all hosts have failed and the running result is not ok 30529 1726882658.11596: done checking to see if all hosts have failed 30529 1726882658.11596: getting the remaining hosts for this loop 30529 1726882658.11598: done getting the remaining hosts for this loop 30529 1726882658.11602: getting the next task for host managed_node1 30529 1726882658.11611: done getting next task for host managed_node1 30529 1726882658.11614: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882658.11620: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882658.11642: getting variables 30529 1726882658.11644: in VariableManager get_vars() 30529 1726882658.11684: Calling all_inventory to load vars for managed_node1 30529 1726882658.11687: Calling groups_inventory to load vars for managed_node1 30529 1726882658.11689: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882658.11708: Calling all_plugins_play to load vars for managed_node1 30529 1726882658.11710: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882658.11713: Calling groups_plugins_play to load vars for managed_node1 30529 1726882658.12514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882658.13781: done with get_vars() 30529 1726882658.13809: done getting variables 30529 1726882658.13871: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:38 -0400 (0:00:00.042) 0:01:12.165 ****** 30529 1726882658.13916: entering _queue_task() for managed_node1/service 30529 1726882658.14203: worker is 1 (out of 1 available) 30529 1726882658.14217: exiting _queue_task() for managed_node1/service 30529 1726882658.14230: done queuing things up, now waiting for results queue to drain 30529 1726882658.14232: waiting for pending results... 
30529 1726882658.14435: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882658.14522: in run() - task 12673a56-9f93-b0f1-edc0-000000001845 30529 1726882658.14535: variable 'ansible_search_path' from source: unknown 30529 1726882658.14538: variable 'ansible_search_path' from source: unknown 30529 1726882658.14567: calling self._execute() 30529 1726882658.14642: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.14646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.14655: variable 'omit' from source: magic vars 30529 1726882658.14931: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.14941: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882658.15026: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882658.15155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882658.17423: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882658.17469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882658.17514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882658.17536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882658.17556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882658.17620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882658.17643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.17660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.17686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.17699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.17735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.17753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.17769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.17798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.17806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.17836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.17854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.17871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.17896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.17908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.18020: variable 'network_connections' from source: include params 30529 1726882658.18030: variable 'interface' from source: play vars 30529 1726882658.18084: variable 'interface' from source: play vars 30529 1726882658.18134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882658.25017: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882658.25058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882658.25102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882658.25232: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882658.25236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882658.25244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882658.25277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.25317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882658.25388: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882658.25698: variable 'network_connections' from source: include params 30529 1726882658.25709: variable 'interface' from source: play vars 30529 1726882658.25786: variable 'interface' from source: play vars 30529 1726882658.25829: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882658.25838: when evaluation is False, skipping this task 30529 1726882658.25844: _execute() done 30529 1726882658.25851: dumping result to json 30529 1726882658.25856: done dumping result, returning 30529 1726882658.25868: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001845] 30529 1726882658.25876: sending task result for task 12673a56-9f93-b0f1-edc0-000000001845 30529 1726882658.26076: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000001845 30529 1726882658.26088: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882658.26140: no more pending results, returning what we have 30529 1726882658.26144: results queue empty 30529 1726882658.26145: checking for any_errors_fatal 30529 1726882658.26151: done checking for any_errors_fatal 30529 1726882658.26152: checking for max_fail_percentage 30529 1726882658.26154: done checking for max_fail_percentage 30529 1726882658.26155: checking to see if all hosts have failed and the running result is not ok 30529 1726882658.26155: done checking to see if all hosts have failed 30529 1726882658.26156: getting the remaining hosts for this loop 30529 1726882658.26158: done getting the remaining hosts for this loop 30529 1726882658.26162: getting the next task for host managed_node1 30529 1726882658.26170: done getting next task for host managed_node1 30529 1726882658.26174: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882658.26179: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882658.26206: getting variables 30529 1726882658.26209: in VariableManager get_vars() 30529 1726882658.26251: Calling all_inventory to load vars for managed_node1 30529 1726882658.26254: Calling groups_inventory to load vars for managed_node1 30529 1726882658.26257: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882658.26267: Calling all_plugins_play to load vars for managed_node1 30529 1726882658.26270: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882658.26274: Calling groups_plugins_play to load vars for managed_node1 30529 1726882658.31905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882658.33237: done with get_vars() 30529 1726882658.33255: done getting variables 30529 1726882658.33295: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:38 -0400 (0:00:00.194) 0:01:12.359 ****** 30529 1726882658.33316: entering _queue_task() for managed_node1/service 30529 1726882658.33586: worker is 1 (out of 1 available) 30529 1726882658.33605: exiting _queue_task() for managed_node1/service 30529 1726882658.33619: done 
queuing things up, now waiting for results queue to drain 30529 1726882658.33621: waiting for pending results... 30529 1726882658.33807: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882658.33916: in run() - task 12673a56-9f93-b0f1-edc0-000000001846 30529 1726882658.33929: variable 'ansible_search_path' from source: unknown 30529 1726882658.33932: variable 'ansible_search_path' from source: unknown 30529 1726882658.33964: calling self._execute() 30529 1726882658.34038: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.34042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.34051: variable 'omit' from source: magic vars 30529 1726882658.34334: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.34344: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882658.34456: variable 'network_provider' from source: set_fact 30529 1726882658.34459: variable 'network_state' from source: role '' defaults 30529 1726882658.34468: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882658.34474: variable 'omit' from source: magic vars 30529 1726882658.34522: variable 'omit' from source: magic vars 30529 1726882658.34543: variable 'network_service_name' from source: role '' defaults 30529 1726882658.34587: variable 'network_service_name' from source: role '' defaults 30529 1726882658.34663: variable '__network_provider_setup' from source: role '' defaults 30529 1726882658.34667: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882658.34712: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882658.34720: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882658.34764: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882658.35099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882658.37032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882658.37117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882658.37159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882658.37191: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882658.37226: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882658.37308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.37346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.37377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.37430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.37448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.37496: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.37525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.37552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.37592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.37698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.37842: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882658.37954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.37981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.38010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.38052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.38070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.38159: variable 'ansible_python' from source: facts 30529 1726882658.38178: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882658.38260: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882658.38340: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882658.38464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.38495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.38524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.38565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.38697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.38701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882658.38712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882658.38714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.38731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882658.38749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882658.38885: variable 'network_connections' from source: include params 30529 1726882658.38900: variable 'interface' from source: play vars 30529 1726882658.38974: variable 'interface' from source: play vars 30529 1726882658.39082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882658.39255: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882658.39304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882658.39345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882658.39388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882658.39457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882658.39492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882658.39534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882658.39573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882658.39630: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882658.39902: variable 'network_connections' from source: include params 30529 1726882658.39914: variable 'interface' from source: play vars 30529 1726882658.39988: variable 'interface' from source: play vars 30529 1726882658.40103: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882658.40127: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882658.40418: variable 'network_connections' from source: include params 30529 1726882658.40429: variable 'interface' from source: play vars 30529 1726882658.40501: variable 'interface' from source: play vars 30529 1726882658.40529: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882658.40608: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882658.40896: variable 'network_connections' from source: include params 30529 1726882658.40908: variable 'interface' from source: play vars 30529 1726882658.40975: variable 'interface' from source: play vars 30529 1726882658.41098: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882658.41101: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882658.41114: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882658.41176: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882658.41379: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882658.41836: variable 'network_connections' from source: include params 30529 1726882658.41845: variable 'interface' from source: play vars 30529 1726882658.41907: variable 'interface' from source: play vars 30529 1726882658.41920: variable 'ansible_distribution' from source: facts 30529 1726882658.41931: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.41935: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.41965: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882658.42137: variable 'ansible_distribution' from source: facts 30529 1726882658.42146: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.42158: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.42170: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882658.42337: variable 'ansible_distribution' from source: facts 30529 1726882658.42346: variable '__network_rh_distros' from source: role '' defaults 30529 1726882658.42355: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.42395: variable 'network_provider' from source: set_fact 30529 1726882658.42423: variable 'omit' from source: magic vars 30529 1726882658.42453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882658.42484: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882658.42511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882658.42532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882658.42546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882658.42578: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882658.42588: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.42599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.42698: Set connection var ansible_shell_executable to /bin/sh 30529 1726882658.42710: Set connection var ansible_pipelining to False 30529 1726882658.42717: Set connection var ansible_shell_type to sh 30529 1726882658.42898: Set connection var ansible_timeout to 10 30529 1726882658.42902: Set connection var ansible_connection to ssh 30529 1726882658.42904: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882658.42908: variable 'ansible_shell_executable' from source: unknown 30529 1726882658.42915: variable 'ansible_connection' from source: unknown 30529 1726882658.42917: variable 'ansible_module_compression' from source: unknown 30529 1726882658.42920: variable 'ansible_shell_type' from source: unknown 30529 1726882658.42922: variable 'ansible_shell_executable' from source: unknown 30529 1726882658.42924: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.42926: variable 'ansible_pipelining' from source: unknown 30529 1726882658.42928: variable 'ansible_timeout' from source: unknown 30529 1726882658.42930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882658.42933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882658.42939: variable 'omit' from source: magic vars 30529 1726882658.42941: starting attempt loop 30529 1726882658.42943: running the handler 30529 1726882658.42986: variable 'ansible_facts' from source: unknown 30529 1726882658.43739: _low_level_execute_command(): starting 30529 1726882658.43751: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882658.44509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882658.44528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882658.44542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.44616: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30529 1726882658.46309: stdout chunk (state=3): >>>/root <<< 30529 1726882658.46410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882658.46468: stderr chunk (state=3): >>><<< 30529 1726882658.46484: stdout chunk (state=3): >>><<< 30529 1726882658.46525: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882658.46544: _low_level_execute_command(): starting 30529 1726882658.46555: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427 `" && echo ansible-tmp-1726882658.4653213-33936-129403569346427="` echo /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427 
`" ) && sleep 0' 30529 1726882658.47312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882658.47421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882658.47464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.47532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882658.49397: stdout chunk (state=3): >>>ansible-tmp-1726882658.4653213-33936-129403569346427=/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427 <<< 30529 1726882658.49513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882658.49554: stderr chunk (state=3): >>><<< 30529 1726882658.49568: stdout chunk (state=3): >>><<< 30529 1726882658.49586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882658.4653213-33936-129403569346427=/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882658.49798: variable 'ansible_module_compression' from source: unknown 30529 1726882658.49801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882658.49803: variable 'ansible_facts' from source: unknown 30529 1726882658.49965: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py 30529 1726882658.50197: Sending initial data 30529 1726882658.50207: Sent initial data (156 bytes) 30529 1726882658.50721: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882658.50734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882658.50749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882658.50806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882658.50869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882658.50891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882658.50922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.51000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882658.52507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882658.52631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882658.52835: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9rsg__0k /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py <<< 30529 1726882658.52848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py" <<< 30529 1726882658.52881: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9rsg__0k" to remote "/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py" <<< 30529 1726882658.54483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882658.54529: stderr chunk (state=3): >>><<< 30529 1726882658.54541: stdout chunk (state=3): >>><<< 30529 1726882658.54564: done transferring module to remote 30529 1726882658.54661: _low_level_execute_command(): starting 30529 1726882658.54664: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/ /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py && sleep 0' 30529 1726882658.55292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882658.55316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882658.55332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.55414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882658.57168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882658.57171: stdout chunk (state=3): >>><<< 30529 1726882658.57174: stderr chunk (state=3): >>><<< 30529 1726882658.57272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882658.57276: _low_level_execute_command(): starting 30529 1726882658.57279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/AnsiballZ_systemd.py && sleep 0' 30529 1726882658.57913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882658.57948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882658.57970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882658.57991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.58066: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882658.86940: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10850304", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297624064", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1851375000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice 
sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", 
"StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882658.88578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882658.88598: stderr chunk (state=3): >>><<< 30529 1726882658.88608: stdout chunk (state=3): >>><<< 30529 1726882658.88800: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10850304", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297624064", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1851375000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882658.88834: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882658.88859: _low_level_execute_command(): starting 30529 1726882658.88867: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882658.4653213-33936-129403569346427/ > /dev/null 2>&1 && sleep 0' 30529 1726882658.89469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882658.89481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882658.89498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882658.89580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882658.89615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882658.89633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882658.89653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882658.89721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882658.91516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882658.91528: stdout chunk (state=3): >>><<< 30529 1726882658.91540: stderr chunk (state=3): >>><<< 30529 1726882658.91557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882658.91568: handler run complete 30529 1726882658.91638: attempt loop complete, returning result 30529 1726882658.91647: _execute() done 30529 1726882658.91699: dumping result to json 30529 1726882658.91702: done dumping result, returning 30529 1726882658.91704: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000001846] 30529 1726882658.91706: sending task result for task 12673a56-9f93-b0f1-edc0-000000001846 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882658.92039: no more pending results, returning what we have 30529 1726882658.92043: results queue empty 30529 1726882658.92044: checking for any_errors_fatal 30529 1726882658.92051: done checking for any_errors_fatal 30529 1726882658.92052: checking for max_fail_percentage 30529 1726882658.92054: done checking for max_fail_percentage 30529 1726882658.92055: checking to see if all hosts have failed and the running result is not ok 30529 1726882658.92056: done checking to see if all hosts have failed 30529 1726882658.92057: getting the remaining hosts for this loop 30529 1726882658.92059: done getting the remaining hosts for this loop 30529 1726882658.92063: getting the next task for host managed_node1 30529 1726882658.92071: done getting next task for host managed_node1 30529 1726882658.92075: ^ task is: 
TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882658.92081: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882658.92096: getting variables 30529 1726882658.92098: in VariableManager get_vars() 30529 1726882658.92134: Calling all_inventory to load vars for managed_node1 30529 1726882658.92136: Calling groups_inventory to load vars for managed_node1 30529 1726882658.92139: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882658.92149: Calling all_plugins_play to load vars for managed_node1 30529 1726882658.92152: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882658.92155: Calling groups_plugins_play to load vars for managed_node1 30529 1726882658.93401: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001846 30529 1726882658.93404: WORKER PROCESS EXITING 30529 1726882658.95017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882658.97263: done with get_vars() 30529 1726882658.97286: done getting variables 30529 1726882658.97346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:38 -0400 (0:00:00.640) 0:01:12.999 ****** 30529 1726882658.97385: entering _queue_task() for managed_node1/service 30529 1726882658.97731: worker is 1 (out of 1 available) 30529 1726882658.97745: exiting _queue_task() for managed_node1/service 30529 1726882658.97760: done queuing things up, now waiting for results queue to drain 30529 1726882658.97762: waiting for pending results... 
30529 1726882658.98044: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882658.98229: in run() - task 12673a56-9f93-b0f1-edc0-000000001847 30529 1726882658.98249: variable 'ansible_search_path' from source: unknown 30529 1726882658.98256: variable 'ansible_search_path' from source: unknown 30529 1726882658.98302: calling self._execute() 30529 1726882658.98405: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882658.98633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882658.98636: variable 'omit' from source: magic vars 30529 1726882658.99180: variable 'ansible_distribution_major_version' from source: facts 30529 1726882658.99300: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882658.99514: variable 'network_provider' from source: set_fact 30529 1726882658.99518: Evaluated conditional (network_provider == "nm"): True 30529 1726882658.99580: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882658.99679: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882658.99865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882659.01808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882659.01872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882659.01915: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882659.01952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882659.01983: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882659.02073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882659.02109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882659.02141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882659.02187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882659.02213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882659.02261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882659.02288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882659.02318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882659.02360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882659.02379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882659.02424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882659.02452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882659.02480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882659.02525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882659.02543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882659.02687: variable 'network_connections' from source: include params 30529 1726882659.02708: variable 'interface' from source: play vars 30529 1726882659.02781: variable 'interface' from source: play vars 30529 1726882659.02862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882659.03018: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882659.03199: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882659.03202: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882659.03206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882659.03208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882659.03210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882659.03234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882659.03267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882659.03329: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882659.03580: variable 'network_connections' from source: include params 30529 1726882659.03596: variable 'interface' from source: play vars 30529 1726882659.03670: variable 'interface' from source: play vars 30529 1726882659.03721: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882659.03730: when evaluation is False, skipping this task 30529 1726882659.03738: _execute() done 30529 1726882659.03745: dumping result to json 30529 1726882659.03796: done dumping result, returning 30529 1726882659.03804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000001847] 30529 
1726882659.03816: sending task result for task 12673a56-9f93-b0f1-edc0-000000001847 30529 1726882659.04102: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001847 30529 1726882659.04106: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882659.04152: no more pending results, returning what we have 30529 1726882659.04156: results queue empty 30529 1726882659.04157: checking for any_errors_fatal 30529 1726882659.04178: done checking for any_errors_fatal 30529 1726882659.04179: checking for max_fail_percentage 30529 1726882659.04180: done checking for max_fail_percentage 30529 1726882659.04181: checking to see if all hosts have failed and the running result is not ok 30529 1726882659.04182: done checking to see if all hosts have failed 30529 1726882659.04183: getting the remaining hosts for this loop 30529 1726882659.04185: done getting the remaining hosts for this loop 30529 1726882659.04189: getting the next task for host managed_node1 30529 1726882659.04198: done getting next task for host managed_node1 30529 1726882659.04202: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882659.04207: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882659.04229: getting variables 30529 1726882659.04231: in VariableManager get_vars() 30529 1726882659.04274: Calling all_inventory to load vars for managed_node1 30529 1726882659.04276: Calling groups_inventory to load vars for managed_node1 30529 1726882659.04279: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882659.04289: Calling all_plugins_play to load vars for managed_node1 30529 1726882659.04292: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882659.04501: Calling groups_plugins_play to load vars for managed_node1 30529 1726882659.05743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882659.07281: done with get_vars() 30529 1726882659.07306: done getting variables 30529 1726882659.07366: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:39 -0400 (0:00:00.100) 0:01:13.100 
****** 30529 1726882659.07404: entering _queue_task() for managed_node1/service 30529 1726882659.07756: worker is 1 (out of 1 available) 30529 1726882659.07770: exiting _queue_task() for managed_node1/service 30529 1726882659.07783: done queuing things up, now waiting for results queue to drain 30529 1726882659.07784: waiting for pending results... 30529 1726882659.08074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882659.08226: in run() - task 12673a56-9f93-b0f1-edc0-000000001848 30529 1726882659.08247: variable 'ansible_search_path' from source: unknown 30529 1726882659.08256: variable 'ansible_search_path' from source: unknown 30529 1726882659.08301: calling self._execute() 30529 1726882659.08409: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.08599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.08603: variable 'omit' from source: magic vars 30529 1726882659.08829: variable 'ansible_distribution_major_version' from source: facts 30529 1726882659.08847: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882659.08968: variable 'network_provider' from source: set_fact 30529 1726882659.08979: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882659.08987: when evaluation is False, skipping this task 30529 1726882659.08995: _execute() done 30529 1726882659.09003: dumping result to json 30529 1726882659.09009: done dumping result, returning 30529 1726882659.09020: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000001848] 30529 1726882659.09029: sending task result for task 12673a56-9f93-b0f1-edc0-000000001848 30529 1726882659.09144: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001848 skipping: [managed_node1] => { "censored": "the output has been 
hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882659.09195: no more pending results, returning what we have 30529 1726882659.09199: results queue empty 30529 1726882659.09200: checking for any_errors_fatal 30529 1726882659.09208: done checking for any_errors_fatal 30529 1726882659.09209: checking for max_fail_percentage 30529 1726882659.09210: done checking for max_fail_percentage 30529 1726882659.09211: checking to see if all hosts have failed and the running result is not ok 30529 1726882659.09212: done checking to see if all hosts have failed 30529 1726882659.09213: getting the remaining hosts for this loop 30529 1726882659.09216: done getting the remaining hosts for this loop 30529 1726882659.09220: getting the next task for host managed_node1 30529 1726882659.09229: done getting next task for host managed_node1 30529 1726882659.09233: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882659.09239: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882659.09266: getting variables 30529 1726882659.09269: in VariableManager get_vars() 30529 1726882659.09317: Calling all_inventory to load vars for managed_node1 30529 1726882659.09319: Calling groups_inventory to load vars for managed_node1 30529 1726882659.09323: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882659.09335: Calling all_plugins_play to load vars for managed_node1 30529 1726882659.09338: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882659.09341: Calling groups_plugins_play to load vars for managed_node1 30529 1726882659.10011: WORKER PROCESS EXITING 30529 1726882659.11087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882659.12580: done with get_vars() 30529 1726882659.12605: done getting variables 30529 1726882659.12665: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:39 -0400 (0:00:00.052) 0:01:13.153 ****** 30529 1726882659.12704: entering _queue_task() for managed_node1/copy 30529 1726882659.13036: worker is 1 (out of 1 available) 30529 1726882659.13050: exiting _queue_task() for managed_node1/copy 30529 1726882659.13063: done queuing things up, now waiting for results queue to drain 30529 1726882659.13064: waiting for 
pending results... 30529 1726882659.13364: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882659.13528: in run() - task 12673a56-9f93-b0f1-edc0-000000001849 30529 1726882659.13547: variable 'ansible_search_path' from source: unknown 30529 1726882659.13556: variable 'ansible_search_path' from source: unknown 30529 1726882659.13599: calling self._execute() 30529 1726882659.13701: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.13713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.13727: variable 'omit' from source: magic vars 30529 1726882659.14123: variable 'ansible_distribution_major_version' from source: facts 30529 1726882659.14141: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882659.14263: variable 'network_provider' from source: set_fact 30529 1726882659.14275: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882659.14285: when evaluation is False, skipping this task 30529 1726882659.14294: _execute() done 30529 1726882659.14303: dumping result to json 30529 1726882659.14310: done dumping result, returning 30529 1726882659.14322: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000001849] 30529 1726882659.14333: sending task result for task 12673a56-9f93-b0f1-edc0-000000001849 30529 1726882659.14546: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001849 30529 1726882659.14549: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882659.14600: no more pending results, returning what we have 30529 1726882659.14605: results queue empty 30529 
1726882659.14606: checking for any_errors_fatal 30529 1726882659.14614: done checking for any_errors_fatal 30529 1726882659.14615: checking for max_fail_percentage 30529 1726882659.14616: done checking for max_fail_percentage 30529 1726882659.14617: checking to see if all hosts have failed and the running result is not ok 30529 1726882659.14618: done checking to see if all hosts have failed 30529 1726882659.14623: getting the remaining hosts for this loop 30529 1726882659.14625: done getting the remaining hosts for this loop 30529 1726882659.14630: getting the next task for host managed_node1 30529 1726882659.14638: done getting next task for host managed_node1 30529 1726882659.14641: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882659.14648: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882659.14669: getting variables 30529 1726882659.14671: in VariableManager get_vars() 30529 1726882659.14712: Calling all_inventory to load vars for managed_node1 30529 1726882659.14715: Calling groups_inventory to load vars for managed_node1 30529 1726882659.14717: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882659.14727: Calling all_plugins_play to load vars for managed_node1 30529 1726882659.14730: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882659.14732: Calling groups_plugins_play to load vars for managed_node1 30529 1726882659.15571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882659.16459: done with get_vars() 30529 1726882659.16474: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:39 -0400 (0:00:00.038) 0:01:13.191 ****** 30529 1726882659.16536: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882659.16990: worker is 1 (out of 1 available) 30529 1726882659.17001: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882659.17012: done queuing things up, now waiting for results queue to drain 30529 1726882659.17013: waiting for pending results... 
30529 1726882659.17251: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882659.17380: in run() - task 12673a56-9f93-b0f1-edc0-00000000184a 30529 1726882659.17405: variable 'ansible_search_path' from source: unknown 30529 1726882659.17413: variable 'ansible_search_path' from source: unknown 30529 1726882659.17455: calling self._execute() 30529 1726882659.17550: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.17566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.17670: variable 'omit' from source: magic vars 30529 1726882659.17959: variable 'ansible_distribution_major_version' from source: facts 30529 1726882659.18000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882659.18004: variable 'omit' from source: magic vars 30529 1726882659.18047: variable 'omit' from source: magic vars 30529 1726882659.18218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882659.20300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882659.20304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882659.20307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882659.20309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882659.20311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882659.20505: variable 'network_provider' from source: set_fact 30529 1726882659.20509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882659.20512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882659.20515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882659.20517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882659.20533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882659.20599: variable 'omit' from source: magic vars 30529 1726882659.20696: variable 'omit' from source: magic vars 30529 1726882659.20780: variable 'network_connections' from source: include params 30529 1726882659.20794: variable 'interface' from source: play vars 30529 1726882659.20853: variable 'interface' from source: play vars 30529 1726882659.20991: variable 'omit' from source: magic vars 30529 1726882659.20998: variable '__lsr_ansible_managed' from source: task vars 30529 1726882659.21054: variable '__lsr_ansible_managed' from source: task vars 30529 1726882659.21213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882659.21423: Loaded config def from plugin (lookup/template) 30529 1726882659.21427: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882659.21455: File lookup term: get_ansible_managed.j2 30529 1726882659.21458: variable 
'ansible_search_path' from source: unknown 30529 1726882659.21461: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882659.21475: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882659.21494: variable 'ansible_search_path' from source: unknown 30529 1726882659.29725: variable 'ansible_managed' from source: unknown 30529 1726882659.29885: variable 'omit' from source: magic vars 30529 1726882659.29931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882659.29964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882659.30000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882659.30026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882659.30082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882659.30085: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882659.30192: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.30199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.30222: Set connection var ansible_shell_executable to /bin/sh 30529 1726882659.30235: Set connection var ansible_pipelining to False 30529 1726882659.30244: Set connection var ansible_shell_type to sh 30529 1726882659.30259: Set connection var ansible_timeout to 10 30529 1726882659.30317: Set connection var ansible_connection to ssh 30529 1726882659.30321: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882659.30323: variable 'ansible_shell_executable' from source: unknown 30529 1726882659.30326: variable 'ansible_connection' from source: unknown 30529 1726882659.30330: variable 'ansible_module_compression' from source: unknown 30529 1726882659.30338: variable 'ansible_shell_type' from source: unknown 30529 1726882659.30346: variable 'ansible_shell_executable' from source: unknown 30529 1726882659.30355: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.30364: variable 'ansible_pipelining' from source: unknown 30529 1726882659.30371: variable 'ansible_timeout' from source: unknown 30529 1726882659.30380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.30537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882659.30645: variable 'omit' from 
source: magic vars 30529 1726882659.30648: starting attempt loop 30529 1726882659.30651: running the handler 30529 1726882659.30653: _low_level_execute_command(): starting 30529 1726882659.30655: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882659.31603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882659.31621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882659.31635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882659.31655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882659.31679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882659.31703: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882659.31807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882659.31822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882659.31934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.31978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.33652: stdout chunk (state=3): >>>/root 
<<< 30529 1726882659.33799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882659.33803: stdout chunk (state=3): >>><<< 30529 1726882659.33825: stderr chunk (state=3): >>><<< 30529 1726882659.33930: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882659.33934: _low_level_execute_command(): starting 30529 1726882659.33937: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149 `" && echo ansible-tmp-1726882659.338443-33985-151309242014149="` echo /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149 `" ) && sleep 0' 30529 1726882659.34737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30529 1726882659.34770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882659.34785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882659.34909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882659.34998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882659.35114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.35202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.37035: stdout chunk (state=3): >>>ansible-tmp-1726882659.338443-33985-151309242014149=/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149 <<< 30529 1726882659.37206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882659.37210: stdout chunk (state=3): >>><<< 30529 1726882659.37212: stderr chunk (state=3): >>><<< 30529 1726882659.37231: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882659.338443-33985-151309242014149=/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882659.37281: variable 'ansible_module_compression' from source: unknown 30529 1726882659.37343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882659.37573: variable 'ansible_facts' from source: unknown 30529 1726882659.37576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py 30529 1726882659.37720: Sending initial data 30529 1726882659.37724: Sent initial data (167 bytes) 30529 1726882659.38278: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882659.38302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882659.38319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882659.38451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882659.38456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882659.38507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.38540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.40175: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882659.40180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882659.40183: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpimrws1mg /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py <<< 30529 1726882659.40185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py" <<< 30529 1726882659.40254: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpimrws1mg" to remote "/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py" <<< 30529 1726882659.41329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882659.41474: stderr chunk (state=3): >>><<< 30529 1726882659.41478: stdout chunk (state=3): >>><<< 30529 1726882659.41480: done transferring module to remote 30529 1726882659.41482: _low_level_execute_command(): starting 30529 1726882659.41485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/ /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py && sleep 0' 30529 1726882659.42605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882659.42797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882659.42806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882659.42824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.42890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.44613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882659.44652: stderr chunk (state=3): >>><<< 30529 1726882659.44657: stdout chunk (state=3): >>><<< 30529 1726882659.44676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882659.44680: _low_level_execute_command(): starting 30529 1726882659.44684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/AnsiballZ_network_connections.py && sleep 0' 30529 1726882659.45305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882659.45405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882659.45421: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30529 1726882659.45432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.45513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.73034: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 30529 1726882659.73112: stdout chunk (state=3): >>> <<< 30529 1726882659.75901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882659.75905: stdout chunk (state=3): >>><<< 30529 1726882659.75909: stderr chunk (state=3): >>><<< 30529 1726882659.75912: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882659.76149: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882659.76152: _low_level_execute_command(): starting 30529 1726882659.76155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882659.338443-33985-151309242014149/ > /dev/null 2>&1 && sleep 0' 30529 1726882659.77207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882659.77211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882659.77215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882659.77218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882659.77220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882659.77222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882659.77553: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882659.77709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882659.77783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882659.79568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882659.79664: stderr chunk (state=3): >>><<< 30529 1726882659.79705: stdout chunk (state=3): >>><<< 30529 1726882659.79735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882659.79855: handler run complete 30529 1726882659.79863: attempt loop complete, returning result 30529 1726882659.80145: _execute() done 30529 1726882659.80148: dumping result to json 30529 1726882659.80150: done dumping result, returning 30529 1726882659.80153: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-00000000184a] 30529 1726882659.80156: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184a 30529 1726882659.80259: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184a 30529 1726882659.80262: WORKER PROCESS EXITING
changed: [managed_node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "ip": {
                        "auto6": false,
                        "dhcp4": false
                    },
                    "name": "statebr",
                    "persistent_state": "present",
                    "type": "bridge"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7

30529 1726882659.80611: no more pending results, returning what we have 30529 1726882659.80615: results queue empty 30529 1726882659.80616: checking for any_errors_fatal 30529 1726882659.80621: done checking for any_errors_fatal 30529 1726882659.80622: checking
for max_fail_percentage 30529 1726882659.80623: done checking for max_fail_percentage 30529 1726882659.80624: checking to see if all hosts have failed and the running result is not ok 30529 1726882659.80625: done checking to see if all hosts have failed 30529 1726882659.80626: getting the remaining hosts for this loop 30529 1726882659.80628: done getting the remaining hosts for this loop 30529 1726882659.80631: getting the next task for host managed_node1 30529 1726882659.80638: done getting next task for host managed_node1 30529 1726882659.80642: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882659.80647: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882659.80659: getting variables 30529 1726882659.80661: in VariableManager get_vars() 30529 1726882659.80939: Calling all_inventory to load vars for managed_node1 30529 1726882659.80942: Calling groups_inventory to load vars for managed_node1 30529 1726882659.80944: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882659.80954: Calling all_plugins_play to load vars for managed_node1 30529 1726882659.80958: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882659.80961: Calling groups_plugins_play to load vars for managed_node1 30529 1726882659.85300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882659.88501: done with get_vars() 30529 1726882659.88528: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:39 -0400 (0:00:00.722) 0:01:13.914 ****** 30529 1726882659.88824: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882659.89749: worker is 1 (out of 1 available) 30529 1726882659.89760: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882659.89773: done queuing things up, now waiting for results queue to drain 30529 1726882659.89774: waiting for pending results... 
30529 1726882659.90018: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882659.90200: in run() - task 12673a56-9f93-b0f1-edc0-00000000184b 30529 1726882659.90205: variable 'ansible_search_path' from source: unknown 30529 1726882659.90208: variable 'ansible_search_path' from source: unknown 30529 1726882659.90217: calling self._execute() 30529 1726882659.90294: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.90309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.90332: variable 'omit' from source: magic vars 30529 1726882659.90783: variable 'ansible_distribution_major_version' from source: facts 30529 1726882659.90808: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882659.90951: variable 'network_state' from source: role '' defaults 30529 1726882659.90968: Evaluated conditional (network_state != {}): False 30529 1726882659.90982: when evaluation is False, skipping this task 30529 1726882659.91034: _execute() done 30529 1726882659.91038: dumping result to json 30529 1726882659.91040: done dumping result, returning 30529 1726882659.91043: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-00000000184b] 30529 1726882659.91046: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184b skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882659.91357: no more pending results, returning what we have 30529 1726882659.91362: results queue empty 30529 1726882659.91364: checking for any_errors_fatal 30529 1726882659.91379: done checking for any_errors_fatal 30529 1726882659.91380: checking for max_fail_percentage 30529 1726882659.91382: done checking for max_fail_percentage 30529 1726882659.91383: 
checking to see if all hosts have failed and the running result is not ok 30529 1726882659.91384: done checking to see if all hosts have failed 30529 1726882659.91384: getting the remaining hosts for this loop 30529 1726882659.91386: done getting the remaining hosts for this loop 30529 1726882659.91395: getting the next task for host managed_node1 30529 1726882659.91403: done getting next task for host managed_node1 30529 1726882659.91408: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882659.91417: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882659.91440: getting variables 30529 1726882659.91442: in VariableManager get_vars() 30529 1726882659.91485: Calling all_inventory to load vars for managed_node1 30529 1726882659.91491: Calling groups_inventory to load vars for managed_node1 30529 1726882659.91707: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882659.91718: Calling all_plugins_play to load vars for managed_node1 30529 1726882659.91721: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882659.91724: Calling groups_plugins_play to load vars for managed_node1 30529 1726882659.92326: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184b 30529 1726882659.92329: WORKER PROCESS EXITING 30529 1726882659.94252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882659.97475: done with get_vars() 30529 1726882659.97514: done getting variables 30529 1726882659.97575: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:39 -0400 (0:00:00.088) 0:01:14.003 ****** 30529 1726882659.97722: entering _queue_task() for managed_node1/debug 30529 1726882659.98627: worker is 1 (out of 1 available) 30529 1726882659.98638: exiting _queue_task() for managed_node1/debug 30529 1726882659.98650: done queuing things up, now waiting for results queue to drain 30529 1726882659.98652: waiting for pending results... 
30529 1726882659.98987: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882659.99410: in run() - task 12673a56-9f93-b0f1-edc0-00000000184c 30529 1726882659.99415: variable 'ansible_search_path' from source: unknown 30529 1726882659.99418: variable 'ansible_search_path' from source: unknown 30529 1726882659.99598: calling self._execute() 30529 1726882659.99707: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882659.99718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882659.99736: variable 'omit' from source: magic vars 30529 1726882660.00536: variable 'ansible_distribution_major_version' from source: facts 30529 1726882660.00554: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882660.00566: variable 'omit' from source: magic vars 30529 1726882660.00758: variable 'omit' from source: magic vars 30529 1726882660.00801: variable 'omit' from source: magic vars 30529 1726882660.00867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882660.00972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882660.01022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882660.01062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.01198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.01208: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882660.01218: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.01226: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882660.01456: Set connection var ansible_shell_executable to /bin/sh 30529 1726882660.01490: Set connection var ansible_pipelining to False 30529 1726882660.01501: Set connection var ansible_shell_type to sh 30529 1726882660.01698: Set connection var ansible_timeout to 10 30529 1726882660.01701: Set connection var ansible_connection to ssh 30529 1726882660.01704: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882660.01706: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.01708: variable 'ansible_connection' from source: unknown 30529 1726882660.01711: variable 'ansible_module_compression' from source: unknown 30529 1726882660.01713: variable 'ansible_shell_type' from source: unknown 30529 1726882660.01715: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.01717: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.01720: variable 'ansible_pipelining' from source: unknown 30529 1726882660.01722: variable 'ansible_timeout' from source: unknown 30529 1726882660.01724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.02133: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882660.02137: variable 'omit' from source: magic vars 30529 1726882660.02139: starting attempt loop 30529 1726882660.02141: running the handler 30529 1726882660.02420: variable '__network_connections_result' from source: set_fact 30529 1726882660.02478: handler run complete 30529 1726882660.02508: attempt loop complete, returning result 30529 1726882660.02516: _execute() done 30529 1726882660.02526: dumping result to json 30529 1726882660.02537: 
done dumping result, returning 30529 1726882660.02552: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000184c] 30529 1726882660.02561: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184c 30529 1726882660.02933: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184c 30529 1726882660.02936: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7" ] } 30529 1726882660.03012: no more pending results, returning what we have 30529 1726882660.03017: results queue empty 30529 1726882660.03018: checking for any_errors_fatal 30529 1726882660.03026: done checking for any_errors_fatal 30529 1726882660.03026: checking for max_fail_percentage 30529 1726882660.03028: done checking for max_fail_percentage 30529 1726882660.03030: checking to see if all hosts have failed and the running result is not ok 30529 1726882660.03031: done checking to see if all hosts have failed 30529 1726882660.03032: getting the remaining hosts for this loop 30529 1726882660.03033: done getting the remaining hosts for this loop 30529 1726882660.03037: getting the next task for host managed_node1 30529 1726882660.03046: done getting next task for host managed_node1 30529 1726882660.03050: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882660.03056: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882660.03071: getting variables 30529 1726882660.03074: in VariableManager get_vars() 30529 1726882660.03324: Calling all_inventory to load vars for managed_node1 30529 1726882660.03327: Calling groups_inventory to load vars for managed_node1 30529 1726882660.03330: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.03340: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.03343: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.03346: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.06536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882660.09804: done with get_vars() 30529 1726882660.09833: done getting variables 30529 1726882660.10132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:40 -0400 (0:00:00.124) 0:01:14.127 ****** 30529 1726882660.10174: entering _queue_task() for managed_node1/debug 30529 1726882660.10910: worker is 1 (out of 1 available) 30529 1726882660.11012: exiting _queue_task() for managed_node1/debug 30529 1726882660.11141: done queuing things up, now waiting for results queue to drain 30529 1726882660.11143: waiting for pending results... 30529 1726882660.11699: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882660.12399: in run() - task 12673a56-9f93-b0f1-edc0-00000000184d 30529 1726882660.12403: variable 'ansible_search_path' from source: unknown 30529 1726882660.12406: variable 'ansible_search_path' from source: unknown 30529 1726882660.12410: calling self._execute() 30529 1726882660.12999: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.13002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.13005: variable 'omit' from source: magic vars 30529 1726882660.14199: variable 'ansible_distribution_major_version' from source: facts 30529 1726882660.14203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882660.14205: variable 'omit' from source: magic vars 30529 1726882660.14208: variable 'omit' from source: magic vars 30529 1726882660.14210: variable 'omit' from source: magic vars 30529 1726882660.14212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882660.14604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882660.14632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882660.14653: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.14669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.14998: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882660.15001: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.15003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.15398: Set connection var ansible_shell_executable to /bin/sh 30529 1726882660.15401: Set connection var ansible_pipelining to False 30529 1726882660.15403: Set connection var ansible_shell_type to sh 30529 1726882660.15405: Set connection var ansible_timeout to 10 30529 1726882660.15407: Set connection var ansible_connection to ssh 30529 1726882660.15409: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882660.15411: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.15413: variable 'ansible_connection' from source: unknown 30529 1726882660.15415: variable 'ansible_module_compression' from source: unknown 30529 1726882660.15417: variable 'ansible_shell_type' from source: unknown 30529 1726882660.15419: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.15420: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.15422: variable 'ansible_pipelining' from source: unknown 30529 1726882660.15424: variable 'ansible_timeout' from source: unknown 30529 1726882660.15426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.15840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882660.15858: variable 'omit' from source: magic vars 30529 1726882660.15868: starting attempt loop 30529 1726882660.15874: running the handler 30529 1726882660.15927: variable '__network_connections_result' from source: set_fact 30529 1726882660.16174: variable '__network_connections_result' from source: set_fact 30529 1726882660.16516: handler run complete 30529 1726882660.16629: attempt loop complete, returning result 30529 1726882660.16804: _execute() done 30529 1726882660.17198: dumping result to json 30529 1726882660.17201: done dumping result, returning 30529 1726882660.17204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000184d] 30529 1726882660.17206: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184d 30529 1726882660.17288: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184d 30529 1726882660.17294: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7" ] } } 30529 1726882660.17397: no more pending results, returning what we have 30529 1726882660.17401: results queue 
empty 30529 1726882660.17402: checking for any_errors_fatal 30529 1726882660.17410: done checking for any_errors_fatal 30529 1726882660.17411: checking for max_fail_percentage 30529 1726882660.17413: done checking for max_fail_percentage 30529 1726882660.17414: checking to see if all hosts have failed and the running result is not ok 30529 1726882660.17415: done checking to see if all hosts have failed 30529 1726882660.17416: getting the remaining hosts for this loop 30529 1726882660.17417: done getting the remaining hosts for this loop 30529 1726882660.17421: getting the next task for host managed_node1 30529 1726882660.17431: done getting next task for host managed_node1 30529 1726882660.17435: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882660.17442: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882660.17457: getting variables 30529 1726882660.17459: in VariableManager get_vars() 30529 1726882660.17625: Calling all_inventory to load vars for managed_node1 30529 1726882660.17627: Calling groups_inventory to load vars for managed_node1 30529 1726882660.17630: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.17641: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.17644: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.17647: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.20817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882660.24811: done with get_vars() 30529 1726882660.24947: done getting variables 30529 1726882660.25118: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:40 -0400 (0:00:00.150) 0:01:14.278 ****** 30529 1726882660.25268: entering _queue_task() for managed_node1/debug 30529 1726882660.25980: worker is 1 (out of 1 available) 30529 1726882660.26119: exiting _queue_task() for managed_node1/debug 30529 1726882660.26131: done queuing things up, now waiting for results queue to drain 30529 1726882660.26133: waiting for pending results... 
30529 1726882660.26701: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882660.26999: in run() - task 12673a56-9f93-b0f1-edc0-00000000184e 30529 1726882660.27003: variable 'ansible_search_path' from source: unknown 30529 1726882660.27006: variable 'ansible_search_path' from source: unknown 30529 1726882660.27198: calling self._execute() 30529 1726882660.27338: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.27342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.27355: variable 'omit' from source: magic vars 30529 1726882660.28157: variable 'ansible_distribution_major_version' from source: facts 30529 1726882660.28233: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882660.28478: variable 'network_state' from source: role '' defaults 30529 1726882660.28487: Evaluated conditional (network_state != {}): False 30529 1726882660.28490: when evaluation is False, skipping this task 30529 1726882660.28496: _execute() done 30529 1726882660.28499: dumping result to json 30529 1726882660.28507: done dumping result, returning 30529 1726882660.28580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-00000000184e] 30529 1726882660.28617: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184e skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882660.28741: no more pending results, returning what we have 30529 1726882660.28745: results queue empty 30529 1726882660.28747: checking for any_errors_fatal 30529 1726882660.28757: done checking for any_errors_fatal 30529 1726882660.28758: checking for max_fail_percentage 30529 1726882660.28761: done checking for max_fail_percentage 30529 1726882660.28762: checking to see if all hosts have 
failed and the running result is not ok 30529 1726882660.28763: done checking to see if all hosts have failed 30529 1726882660.28763: getting the remaining hosts for this loop 30529 1726882660.28765: done getting the remaining hosts for this loop 30529 1726882660.28770: getting the next task for host managed_node1 30529 1726882660.28783: done getting next task for host managed_node1 30529 1726882660.28789: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882660.28798: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882660.28825: getting variables 30529 1726882660.28828: in VariableManager get_vars() 30529 1726882660.28871: Calling all_inventory to load vars for managed_node1 30529 1726882660.28874: Calling groups_inventory to load vars for managed_node1 30529 1726882660.28877: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.29136: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.29140: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.29145: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.29664: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184e 30529 1726882660.29667: WORKER PROCESS EXITING 30529 1726882660.32753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882660.36563: done with get_vars() 30529 1726882660.36697: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:40 -0400 (0:00:00.115) 0:01:14.394 ****** 30529 1726882660.36801: entering _queue_task() for managed_node1/ping 30529 1726882660.37669: worker is 1 (out of 1 available) 30529 1726882660.37685: exiting _queue_task() for managed_node1/ping 30529 1726882660.37704: done queuing things up, now waiting for results queue to drain 30529 1726882660.37706: waiting for pending results... 
30529 1726882660.38640: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882660.39283: in run() - task 12673a56-9f93-b0f1-edc0-00000000184f 30529 1726882660.39287: variable 'ansible_search_path' from source: unknown 30529 1726882660.39289: variable 'ansible_search_path' from source: unknown 30529 1726882660.39292: calling self._execute() 30529 1726882660.39463: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.39850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.39854: variable 'omit' from source: magic vars 30529 1726882660.40885: variable 'ansible_distribution_major_version' from source: facts 30529 1726882660.40949: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882660.40961: variable 'omit' from source: magic vars 30529 1726882660.41152: variable 'omit' from source: magic vars 30529 1726882660.41290: variable 'omit' from source: magic vars 30529 1726882660.41699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882660.41702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882660.41705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882660.41721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.41738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.41838: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882660.41848: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.41855: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882660.42201: Set connection var ansible_shell_executable to /bin/sh 30529 1726882660.42204: Set connection var ansible_pipelining to False 30529 1726882660.42206: Set connection var ansible_shell_type to sh 30529 1726882660.42208: Set connection var ansible_timeout to 10 30529 1726882660.42210: Set connection var ansible_connection to ssh 30529 1726882660.42212: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882660.42214: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.42216: variable 'ansible_connection' from source: unknown 30529 1726882660.42218: variable 'ansible_module_compression' from source: unknown 30529 1726882660.42220: variable 'ansible_shell_type' from source: unknown 30529 1726882660.42222: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.42224: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.42226: variable 'ansible_pipelining' from source: unknown 30529 1726882660.42228: variable 'ansible_timeout' from source: unknown 30529 1726882660.42231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.43045: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882660.43063: variable 'omit' from source: magic vars 30529 1726882660.43159: starting attempt loop 30529 1726882660.43169: running the handler 30529 1726882660.43187: _low_level_execute_command(): starting 30529 1726882660.43201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882660.44911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.44932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882660.44946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882660.44957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882660.45229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.46712: stdout chunk (state=3): >>>/root <<< 30529 1726882660.46989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882660.47108: stderr chunk (state=3): >>><<< 30529 1726882660.47117: stdout chunk (state=3): >>><<< 30529 1726882660.47148: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882660.47186: _low_level_execute_command(): starting 30529 1726882660.47364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362 `" && echo ansible-tmp-1726882660.4717247-34044-257413179536362="` echo /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362 `" ) && sleep 0' 30529 1726882660.48487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.48674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882660.48717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.50584: stdout chunk (state=3): >>>ansible-tmp-1726882660.4717247-34044-257413179536362=/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362 <<< 30529 1726882660.50728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882660.50738: stdout chunk (state=3): >>><<< 30529 1726882660.50753: stderr chunk (state=3): >>><<< 30529 1726882660.50773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882660.4717247-34044-257413179536362=/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882660.50969: variable 'ansible_module_compression' from source: unknown 30529 1726882660.51012: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882660.51186: variable 'ansible_facts' from source: unknown 30529 1726882660.51482: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py 30529 1726882660.51819: Sending initial data 30529 1726882660.51823: Sent initial data (153 bytes) 30529 1726882660.53259: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882660.53412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.53740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882660.53743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
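An aside on the `ansible-tmp-1726882660.4717247-34044-257413179536362` paths seen in the mkdir command above: the name appears to be built from a wall-clock timestamp, a process id, and a large random number. The sketch below reproduces that naming pattern; the exact field choices (`time.time()`, `os.getpid()`, a 48-bit random int) are an educated guess from the log, not taken from ansible-core source.

```python
import os
import random
import time

def generate_remote_tmp_name(prefix: str = "ansible-tmp") -> str:
    """Sketch of the ansible-tmp-<time>-<pid>-<rand> naming pattern
    visible in the log; field choices are assumptions, not verified
    against ansible-core."""
    return "%s-%s-%s-%s" % (prefix, time.time(), os.getpid(),
                            random.randint(0, 2 ** 48))

name = generate_remote_tmp_name()
print(name)  # e.g. ansible-tmp-1726882660.4717247-34044-257413179536362
```

The randomized suffix keeps concurrent task executions from colliding in the shared `~/.ansible/tmp` directory.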
30529 1726882660.53831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882660.53962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.55394: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882660.55412: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882660.55546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882660.55609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3fjii60e /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py <<< 30529 1726882660.55612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py" <<< 30529 1726882660.55703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp3fjii60e" to remote "/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py" <<< 30529 1726882660.56736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882660.56832: stderr chunk (state=3): >>><<< 30529 1726882660.56835: stdout chunk (state=3): >>><<< 30529 1726882660.56838: done transferring module to remote 30529 1726882660.56840: _low_level_execute_command(): starting 30529 1726882660.56941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/ /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py && sleep 0' 30529 1726882660.58208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.58265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882660.58309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882660.58322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882660.58482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.60229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882660.60239: stdout chunk (state=3): >>><<< 30529 1726882660.60251: stderr chunk (state=3): >>><<< 30529 1726882660.60277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882660.60315: _low_level_execute_command(): starting 30529 1726882660.60325: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/AnsiballZ_ping.py && sleep 0' 30529 1726882660.61599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882660.61603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882660.61605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882660.61620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.61720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882660.61747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882660.61771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30529 1726882660.61865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.76904: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882660.78003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882660.78006: stdout chunk (state=3): >>><<< 30529 1726882660.78009: stderr chunk (state=3): >>><<< 30529 1726882660.78011: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
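The module run above returned `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout. A minimal stand-in for the `ping` module's contract is sketched below: it echoes the `data` argument back (defaulting to `"pong"`). This is a simplified illustration of the observed output shape, not the actual `ansible.builtin.ping` module code.

```python
import json

def ping_module(module_args: dict) -> str:
    """Simplified stand-in for the ping module's contract: echo the
    'data' argument (default 'pong') and report the invocation, as a
    JSON document on stdout. Not the real module implementation."""
    data = module_args.get("data", "pong")
    return json.dumps({"ping": data,
                       "invocation": {"module_args": {"data": data}}})

print(ping_module({"data": "pong"}))
# {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

Ansible parses exactly this kind of JSON document from the remote interpreter's stdout to build the task result shown a few lines further down.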
30529 1726882660.78015: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882660.78018: _low_level_execute_command(): starting 30529 1726882660.78020: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882660.4717247-34044-257413179536362/ > /dev/null 2>&1 && sleep 0' 30529 1726882660.79653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882660.79718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882660.79933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882660.79938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882660.80044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882660.80080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882660.80126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882660.81951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882660.81960: stdout chunk (state=3): >>><<< 30529 1726882660.81970: stderr chunk (state=3): >>><<< 30529 1726882660.81990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882660.82018: handler run complete 30529 
1726882660.82037: attempt loop complete, returning result 30529 1726882660.82048: _execute() done 30529 1726882660.82055: dumping result to json 30529 1726882660.82062: done dumping result, returning 30529 1726882660.82074: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-00000000184f] 30529 1726882660.82082: sending task result for task 12673a56-9f93-b0f1-edc0-00000000184f ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882660.82533: no more pending results, returning what we have 30529 1726882660.82537: results queue empty 30529 1726882660.82538: checking for any_errors_fatal 30529 1726882660.82543: done checking for any_errors_fatal 30529 1726882660.82543: checking for max_fail_percentage 30529 1726882660.82545: done checking for max_fail_percentage 30529 1726882660.82546: checking to see if all hosts have failed and the running result is not ok 30529 1726882660.82547: done checking to see if all hosts have failed 30529 1726882660.82548: getting the remaining hosts for this loop 30529 1726882660.82550: done getting the remaining hosts for this loop 30529 1726882660.82553: getting the next task for host managed_node1 30529 1726882660.82563: done getting next task for host managed_node1 30529 1726882660.82565: ^ task is: TASK: meta (role_complete) 30529 1726882660.82570: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882660.82582: getting variables 30529 1726882660.82584: in VariableManager get_vars() 30529 1726882660.82670: Calling all_inventory to load vars for managed_node1 30529 1726882660.82673: Calling groups_inventory to load vars for managed_node1 30529 1726882660.82676: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.82682: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000184f 30529 1726882660.82685: WORKER PROCESS EXITING 30529 1726882660.82700: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.82704: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.82707: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.85404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882660.87125: done with get_vars() 30529 1726882660.87149: done getting variables 30529 1726882660.87367: done queuing things up, now waiting for results queue to drain 30529 1726882660.87369: results queue empty 30529 1726882660.87370: checking for any_errors_fatal 30529 1726882660.87373: done checking for any_errors_fatal 30529 1726882660.87374: checking for max_fail_percentage 30529 1726882660.87375: done checking for max_fail_percentage 30529 1726882660.87376: checking to see if all hosts have failed and the running result 
is not ok 30529 1726882660.87377: done checking to see if all hosts have failed 30529 1726882660.87377: getting the remaining hosts for this loop 30529 1726882660.87378: done getting the remaining hosts for this loop 30529 1726882660.87381: getting the next task for host managed_node1 30529 1726882660.87386: done getting next task for host managed_node1 30529 1726882660.87392: ^ task is: TASK: Show result 30529 1726882660.87396: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882660.87399: getting variables 30529 1726882660.87400: in VariableManager get_vars() 30529 1726882660.87412: Calling all_inventory to load vars for managed_node1 30529 1726882660.87414: Calling groups_inventory to load vars for managed_node1 30529 1726882660.87417: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.87422: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.87424: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.87427: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.89327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882660.91807: done with get_vars() 30529 1726882660.91839: done getting variables 30529 1726882660.91890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:37:40 -0400 (0:00:00.551) 0:01:14.945 ****** 30529 1726882660.91934: entering _queue_task() for managed_node1/debug 30529 1726882660.92518: worker is 1 (out of 1 available) 30529 1726882660.92530: exiting _queue_task() for managed_node1/debug 30529 1726882660.92542: done queuing things up, now waiting for results queue to drain 30529 1726882660.92544: waiting for pending results... 
30529 1726882660.92906: running TaskExecutor() for managed_node1/TASK: Show result 30529 1726882660.93140: in run() - task 12673a56-9f93-b0f1-edc0-0000000017d1 30529 1726882660.93161: variable 'ansible_search_path' from source: unknown 30529 1726882660.93215: variable 'ansible_search_path' from source: unknown 30529 1726882660.93260: calling self._execute() 30529 1726882660.93827: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.93831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.93834: variable 'omit' from source: magic vars 30529 1726882660.94237: variable 'ansible_distribution_major_version' from source: facts 30529 1726882660.94263: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882660.94276: variable 'omit' from source: magic vars 30529 1726882660.94331: variable 'omit' from source: magic vars 30529 1726882660.94381: variable 'omit' from source: magic vars 30529 1726882660.94428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882660.94477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882660.94510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882660.94533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.94550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882660.94597: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882660.94609: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.94617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.94736: Set 
connection var ansible_shell_executable to /bin/sh 30529 1726882660.94748: Set connection var ansible_pipelining to False 30529 1726882660.94755: Set connection var ansible_shell_type to sh 30529 1726882660.94767: Set connection var ansible_timeout to 10 30529 1726882660.94773: Set connection var ansible_connection to ssh 30529 1726882660.94781: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882660.94819: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.94829: variable 'ansible_connection' from source: unknown 30529 1726882660.94836: variable 'ansible_module_compression' from source: unknown 30529 1726882660.94843: variable 'ansible_shell_type' from source: unknown 30529 1726882660.94853: variable 'ansible_shell_executable' from source: unknown 30529 1726882660.94861: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882660.94868: variable 'ansible_pipelining' from source: unknown 30529 1726882660.94876: variable 'ansible_timeout' from source: unknown 30529 1726882660.94884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882660.95050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882660.95072: variable 'omit' from source: magic vars 30529 1726882660.95084: starting attempt loop 30529 1726882660.95095: running the handler 30529 1726882660.95152: variable '__network_connections_result' from source: set_fact 30529 1726882660.95244: variable '__network_connections_result' from source: set_fact 30529 1726882660.95379: handler run complete 30529 1726882660.95417: attempt loop complete, returning result 30529 1726882660.95429: _execute() done 30529 1726882660.95436: dumping result to json 30529 
1726882660.95451: done dumping result, returning 30529 1726882660.95502: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-0000000017d1] 30529 1726882660.95508: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d1 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7" ] } } 30529 1726882660.95851: no more pending results, returning what we have 30529 1726882660.95859: results queue empty 30529 1726882660.95860: checking for any_errors_fatal 30529 1726882660.95862: done checking for any_errors_fatal 30529 1726882660.95863: checking for max_fail_percentage 30529 1726882660.95865: done checking for max_fail_percentage 30529 1726882660.95869: checking to see if all hosts have failed and the running result is not ok 30529 1726882660.95871: done checking to see if all hosts have failed 30529 1726882660.95871: getting the remaining hosts for this loop 30529 1726882660.95874: done getting the remaining hosts for this loop 30529 1726882660.95878: getting the next task for host managed_node1 30529 1726882660.95947: done getting next task for host managed_node1 30529 1726882660.95952: ^ task is: TASK: Include network role 30529 1726882660.95956: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882660.95961: getting variables 30529 1726882660.95963: in VariableManager get_vars() 30529 1726882660.96119: Calling all_inventory to load vars for managed_node1 30529 1726882660.96122: Calling groups_inventory to load vars for managed_node1 30529 1726882660.96126: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882660.96138: Calling all_plugins_play to load vars for managed_node1 30529 1726882660.96141: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882660.96145: Calling groups_plugins_play to load vars for managed_node1 30529 1726882660.96770: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d1 30529 1726882660.96774: WORKER PROCESS EXITING 30529 1726882660.98653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.00895: done with get_vars() 30529 1726882661.00935: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:37:41 -0400 (0:00:00.091) 0:01:15.036 ****** 30529 1726882661.01045: entering _queue_task() 
for managed_node1/include_role 30529 1726882661.01433: worker is 1 (out of 1 available) 30529 1726882661.01446: exiting _queue_task() for managed_node1/include_role 30529 1726882661.01464: done queuing things up, now waiting for results queue to drain 30529 1726882661.01466: waiting for pending results... 30529 1726882661.01776: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882661.01934: in run() - task 12673a56-9f93-b0f1-edc0-0000000017d5 30529 1726882661.01957: variable 'ansible_search_path' from source: unknown 30529 1726882661.01966: variable 'ansible_search_path' from source: unknown 30529 1726882661.02019: calling self._execute() 30529 1726882661.02128: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.02228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.02232: variable 'omit' from source: magic vars 30529 1726882661.02575: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.02599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.02611: _execute() done 30529 1726882661.02619: dumping result to json 30529 1726882661.02626: done dumping result, returning 30529 1726882661.02636: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-0000000017d5] 30529 1726882661.02645: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d5 30529 1726882661.03010: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d5 30529 1726882661.03013: WORKER PROCESS EXITING 30529 1726882661.03041: no more pending results, returning what we have 30529 1726882661.03046: in VariableManager get_vars() 30529 1726882661.03091: Calling all_inventory to load vars for managed_node1 30529 1726882661.03096: Calling groups_inventory to load vars for managed_node1 30529 1726882661.03100: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882661.03114: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.03118: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.03131: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.04846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.06828: done with get_vars() 30529 1726882661.06857: variable 'ansible_search_path' from source: unknown 30529 1726882661.06859: variable 'ansible_search_path' from source: unknown 30529 1726882661.07031: variable 'omit' from source: magic vars 30529 1726882661.07082: variable 'omit' from source: magic vars 30529 1726882661.07100: variable 'omit' from source: magic vars 30529 1726882661.07107: we have included files to process 30529 1726882661.07109: generating all_blocks data 30529 1726882661.07111: done generating all_blocks data 30529 1726882661.07116: processing included file: fedora.linux_system_roles.network 30529 1726882661.07136: in VariableManager get_vars() 30529 1726882661.07152: done with get_vars() 30529 1726882661.07196: in VariableManager get_vars() 30529 1726882661.07218: done with get_vars() 30529 1726882661.07264: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882661.07407: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882661.07492: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882661.07971: in VariableManager get_vars() 30529 1726882661.07999: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882661.10036: iterating over new_blocks loaded from include file 30529 1726882661.10039: in VariableManager get_vars() 30529 1726882661.10056: done with get_vars() 30529 
1726882661.10058: filtering new block on tags 30529 1726882661.10367: done filtering new block on tags 30529 1726882661.10371: in VariableManager get_vars() 30529 1726882661.10387: done with get_vars() 30529 1726882661.10388: filtering new block on tags 30529 1726882661.10406: done filtering new block on tags 30529 1726882661.10408: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882661.10414: extending task lists for all hosts with included blocks 30529 1726882661.10526: done extending task lists 30529 1726882661.10528: done processing included files 30529 1726882661.10529: results queue empty 30529 1726882661.10530: checking for any_errors_fatal 30529 1726882661.10534: done checking for any_errors_fatal 30529 1726882661.10535: checking for max_fail_percentage 30529 1726882661.10537: done checking for max_fail_percentage 30529 1726882661.10538: checking to see if all hosts have failed and the running result is not ok 30529 1726882661.10539: done checking to see if all hosts have failed 30529 1726882661.10539: getting the remaining hosts for this loop 30529 1726882661.10541: done getting the remaining hosts for this loop 30529 1726882661.10543: getting the next task for host managed_node1 30529 1726882661.10553: done getting next task for host managed_node1 30529 1726882661.10556: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882661.10559: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882661.10570: getting variables 30529 1726882661.10571: in VariableManager get_vars() 30529 1726882661.10585: Calling all_inventory to load vars for managed_node1 30529 1726882661.10588: Calling groups_inventory to load vars for managed_node1 30529 1726882661.10590: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882661.10597: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.10599: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.10602: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.12181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.14674: done with get_vars() 30529 1726882661.14708: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:41 -0400 (0:00:00.137) 0:01:15.174 ****** 30529 1726882661.14810: entering _queue_task() for managed_node1/include_tasks 30529 1726882661.15442: worker is 1 (out of 1 available) 30529 
1726882661.15453: exiting _queue_task() for managed_node1/include_tasks 30529 1726882661.15466: done queuing things up, now waiting for results queue to drain 30529 1726882661.15468: waiting for pending results... 30529 1726882661.15810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882661.15964: in run() - task 12673a56-9f93-b0f1-edc0-0000000019bf 30529 1726882661.15987: variable 'ansible_search_path' from source: unknown 30529 1726882661.16123: variable 'ansible_search_path' from source: unknown 30529 1726882661.16127: calling self._execute() 30529 1726882661.16186: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.16203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.16221: variable 'omit' from source: magic vars 30529 1726882661.16773: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.16797: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.16826: _execute() done 30529 1726882661.16835: dumping result to json 30529 1726882661.16842: done dumping result, returning 30529 1726882661.16854: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-0000000019bf] 30529 1726882661.16863: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019bf 30529 1726882661.17089: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019bf 30529 1726882661.17196: WORKER PROCESS EXITING 30529 1726882661.17256: no more pending results, returning what we have 30529 1726882661.17261: in VariableManager get_vars() 30529 1726882661.17324: Calling all_inventory to load vars for managed_node1 30529 1726882661.17327: Calling groups_inventory to load vars for managed_node1 30529 1726882661.17330: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882661.17343: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.17346: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.17348: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.20874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.25102: done with get_vars() 30529 1726882661.25138: variable 'ansible_search_path' from source: unknown 30529 1726882661.25140: variable 'ansible_search_path' from source: unknown 30529 1726882661.25192: we have included files to process 30529 1726882661.25195: generating all_blocks data 30529 1726882661.25197: done generating all_blocks data 30529 1726882661.25202: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882661.25203: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882661.25206: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882661.26860: done processing included file 30529 1726882661.26862: iterating over new_blocks loaded from include file 30529 1726882661.26864: in VariableManager get_vars() 30529 1726882661.26894: done with get_vars() 30529 1726882661.26896: filtering new block on tags 30529 1726882661.26928: done filtering new block on tags 30529 1726882661.26931: in VariableManager get_vars() 30529 1726882661.26954: done with get_vars() 30529 1726882661.26955: filtering new block on tags 30529 1726882661.27082: done filtering new block on tags 30529 1726882661.27085: in VariableManager get_vars() 30529 1726882661.27218: done with get_vars() 30529 1726882661.27220: filtering new block on tags 30529 1726882661.27262: done filtering new block on tags 30529 1726882661.27265: done iterating over new_blocks 
loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882661.27270: extending task lists for all hosts with included blocks 30529 1726882661.29972: done extending task lists 30529 1726882661.29973: done processing included files 30529 1726882661.29974: results queue empty 30529 1726882661.29975: checking for any_errors_fatal 30529 1726882661.29978: done checking for any_errors_fatal 30529 1726882661.29978: checking for max_fail_percentage 30529 1726882661.29980: done checking for max_fail_percentage 30529 1726882661.29980: checking to see if all hosts have failed and the running result is not ok 30529 1726882661.29981: done checking to see if all hosts have failed 30529 1726882661.29982: getting the remaining hosts for this loop 30529 1726882661.29983: done getting the remaining hosts for this loop 30529 1726882661.29986: getting the next task for host managed_node1 30529 1726882661.29996: done getting next task for host managed_node1 30529 1726882661.29999: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882661.30003: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882661.30020: getting variables 30529 1726882661.30025: in VariableManager get_vars() 30529 1726882661.30040: Calling all_inventory to load vars for managed_node1 30529 1726882661.30042: Calling groups_inventory to load vars for managed_node1 30529 1726882661.30044: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882661.30049: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.30052: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.30054: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.31930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.35364: done with get_vars() 30529 1726882661.35396: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:41 -0400 (0:00:00.207) 0:01:15.382 ****** 30529 1726882661.35596: entering _queue_task() for managed_node1/setup 30529 1726882661.36361: worker is 1 (out of 1 available) 30529 1726882661.36374: exiting _queue_task() for managed_node1/setup 30529 
1726882661.36394: done queuing things up, now waiting for results queue to drain 30529 1726882661.36397: waiting for pending results... 30529 1726882661.37142: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882661.37403: in run() - task 12673a56-9f93-b0f1-edc0-000000001a16 30529 1726882661.37469: variable 'ansible_search_path' from source: unknown 30529 1726882661.37478: variable 'ansible_search_path' from source: unknown 30529 1726882661.37604: calling self._execute() 30529 1726882661.37824: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.37835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.37848: variable 'omit' from source: magic vars 30529 1726882661.38757: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.38761: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.39229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882661.44174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882661.44600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882661.44777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882661.44781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882661.44999: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882661.45117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30529 1726882661.45241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882661.45273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882661.45471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882661.45495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882661.45660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882661.45696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882661.45778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882661.45829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882661.45913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882661.46400: variable '__network_required_facts' from source: role '' defaults 30529 1726882661.46403: variable 'ansible_facts' from source: unknown 30529 1726882661.48112: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882661.48121: when evaluation is False, skipping this task 30529 1726882661.48128: _execute() done 30529 1726882661.48136: dumping result to json 30529 1726882661.48146: done dumping result, returning 30529 1726882661.48158: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000001a16] 30529 1726882661.48252: sending task result for task 12673a56-9f93-b0f1-edc0-000000001a16 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882661.48460: no more pending results, returning what we have 30529 1726882661.48466: results queue empty 30529 1726882661.48467: checking for any_errors_fatal 30529 1726882661.48469: done checking for any_errors_fatal 30529 1726882661.48470: checking for max_fail_percentage 30529 1726882661.48472: done checking for max_fail_percentage 30529 1726882661.48473: checking to see if all hosts have failed and the running result is not ok 30529 1726882661.48474: done checking to see if all hosts have failed 30529 1726882661.48475: getting the remaining hosts for this loop 30529 1726882661.48477: done getting the remaining hosts for this loop 30529 1726882661.48481: getting the next task for host managed_node1 30529 1726882661.48497: done getting next task for host managed_node1 30529 1726882661.48502: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882661.48509: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882661.48534: getting variables 30529 1726882661.48536: in VariableManager get_vars() 30529 1726882661.48579: Calling all_inventory to load vars for managed_node1 30529 1726882661.48582: Calling groups_inventory to load vars for managed_node1 30529 1726882661.48584: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882661.48905: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.48910: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.48915: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.49599: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001a16 30529 1726882661.49609: WORKER PROCESS EXITING 30529 1726882661.51285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.53456: done with get_vars() 30529 1726882661.53483: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:41 -0400 (0:00:00.181) 0:01:15.563 ****** 30529 1726882661.53711: entering _queue_task() for managed_node1/stat 30529 1726882661.54498: worker is 1 (out of 1 available) 30529 1726882661.54511: exiting _queue_task() for managed_node1/stat 30529 1726882661.54525: done queuing things up, now waiting for results queue to drain 30529 1726882661.54527: waiting for pending results... 
30529 1726882661.55081: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882661.55364: in run() - task 12673a56-9f93-b0f1-edc0-000000001a18 30529 1726882661.55521: variable 'ansible_search_path' from source: unknown 30529 1726882661.55529: variable 'ansible_search_path' from source: unknown 30529 1726882661.55572: calling self._execute() 30529 1726882661.55695: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.55837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.55854: variable 'omit' from source: magic vars 30529 1726882661.56657: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.56669: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.56926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882661.57121: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882661.57174: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882661.57210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882661.57246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882661.57333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882661.57583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882661.57590: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882661.57594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882661.57596: variable '__network_is_ostree' from source: set_fact 30529 1726882661.57598: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882661.57600: when evaluation is False, skipping this task 30529 1726882661.57601: _execute() done 30529 1726882661.57603: dumping result to json 30529 1726882661.57605: done dumping result, returning 30529 1726882661.57607: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000001a18] 30529 1726882661.57609: sending task result for task 12673a56-9f93-b0f1-edc0-000000001a18 30529 1726882661.57672: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001a18 30529 1726882661.57675: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882661.57742: no more pending results, returning what we have 30529 1726882661.57747: results queue empty 30529 1726882661.57748: checking for any_errors_fatal 30529 1726882661.57757: done checking for any_errors_fatal 30529 1726882661.57758: checking for max_fail_percentage 30529 1726882661.57759: done checking for max_fail_percentage 30529 1726882661.57760: checking to see if all hosts have failed and the running result is not ok 30529 1726882661.57762: done checking to see if all hosts have failed 30529 1726882661.57762: getting the remaining hosts for this loop 30529 1726882661.57764: done getting the remaining hosts for this loop 30529 
1726882661.57768: getting the next task for host managed_node1 30529 1726882661.57778: done getting next task for host managed_node1 30529 1726882661.57781: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882661.57787: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882661.58015: getting variables 30529 1726882661.58017: in VariableManager get_vars() 30529 1726882661.58055: Calling all_inventory to load vars for managed_node1 30529 1726882661.58058: Calling groups_inventory to load vars for managed_node1 30529 1726882661.58061: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882661.58069: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.58072: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.58075: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.65167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.67264: done with get_vars() 30529 1726882661.67413: done getting variables 30529 1726882661.67463: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:41 -0400 (0:00:00.137) 0:01:15.701 ****** 30529 1726882661.67529: entering _queue_task() for managed_node1/set_fact 30529 1726882661.68047: worker is 1 (out of 1 available) 30529 1726882661.68060: exiting _queue_task() for managed_node1/set_fact 30529 1726882661.68073: done queuing things up, now waiting for results queue to drain 30529 1726882661.68076: waiting for pending results... 
30529 1726882661.68479: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882661.68725: in run() - task 12673a56-9f93-b0f1-edc0-000000001a19 30529 1726882661.68732: variable 'ansible_search_path' from source: unknown 30529 1726882661.68737: variable 'ansible_search_path' from source: unknown 30529 1726882661.68832: calling self._execute() 30529 1726882661.68898: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.68905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.68917: variable 'omit' from source: magic vars 30529 1726882661.69356: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.69371: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.69581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882661.69981: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882661.70027: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882661.70271: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882661.70275: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882661.70278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882661.70498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882661.70502: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882661.70505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882661.70509: variable '__network_is_ostree' from source: set_fact 30529 1726882661.70512: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882661.70515: when evaluation is False, skipping this task 30529 1726882661.70518: _execute() done 30529 1726882661.70520: dumping result to json 30529 1726882661.70523: done dumping result, returning 30529 1726882661.70530: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000001a19] 30529 1726882661.70532: sending task result for task 12673a56-9f93-b0f1-edc0-000000001a19 30529 1726882661.70612: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001a19 30529 1726882661.70616: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882661.70683: no more pending results, returning what we have 30529 1726882661.70687: results queue empty 30529 1726882661.70688: checking for any_errors_fatal 30529 1726882661.70699: done checking for any_errors_fatal 30529 1726882661.70700: checking for max_fail_percentage 30529 1726882661.70702: done checking for max_fail_percentage 30529 1726882661.70703: checking to see if all hosts have failed and the running result is not ok 30529 1726882661.70704: done checking to see if all hosts have failed 30529 1726882661.70705: getting the remaining hosts for this loop 30529 1726882661.70710: done getting the remaining hosts for this loop 
30529 1726882661.70714: getting the next task for host managed_node1 30529 1726882661.70732: done getting next task for host managed_node1 30529 1726882661.70739: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882661.70745: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882661.70772: getting variables 30529 1726882661.70774: in VariableManager get_vars() 30529 1726882661.70969: Calling all_inventory to load vars for managed_node1 30529 1726882661.70972: Calling groups_inventory to load vars for managed_node1 30529 1726882661.70975: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882661.70987: Calling all_plugins_play to load vars for managed_node1 30529 1726882661.70991: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882661.70996: Calling groups_plugins_play to load vars for managed_node1 30529 1726882661.72969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882661.77246: done with get_vars() 30529 1726882661.77276: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:41 -0400 (0:00:00.099) 0:01:15.801 ****** 30529 1726882661.77520: entering _queue_task() for managed_node1/service_facts 30529 1726882661.78096: worker is 1 (out of 1 available) 30529 1726882661.78109: exiting _queue_task() for managed_node1/service_facts 30529 1726882661.78122: done queuing things up, now waiting for results queue to drain 30529 1726882661.78124: waiting for pending results... 
30529 1726882661.78355: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882661.78875: in run() - task 12673a56-9f93-b0f1-edc0-000000001a1b 30529 1726882661.78880: variable 'ansible_search_path' from source: unknown 30529 1726882661.78883: variable 'ansible_search_path' from source: unknown 30529 1726882661.78885: calling self._execute() 30529 1726882661.78912: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.78916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.78926: variable 'omit' from source: magic vars 30529 1726882661.79346: variable 'ansible_distribution_major_version' from source: facts 30529 1726882661.79357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882661.79364: variable 'omit' from source: magic vars 30529 1726882661.79526: variable 'omit' from source: magic vars 30529 1726882661.79529: variable 'omit' from source: magic vars 30529 1726882661.79542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882661.79577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882661.79603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882661.79634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882661.79643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882661.79742: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882661.79746: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.79749: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882661.79804: Set connection var ansible_shell_executable to /bin/sh 30529 1726882661.79808: Set connection var ansible_pipelining to False 30529 1726882661.79811: Set connection var ansible_shell_type to sh 30529 1726882661.79825: Set connection var ansible_timeout to 10 30529 1726882661.79840: Set connection var ansible_connection to ssh 30529 1726882661.79849: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882661.79869: variable 'ansible_shell_executable' from source: unknown 30529 1726882661.79873: variable 'ansible_connection' from source: unknown 30529 1726882661.79876: variable 'ansible_module_compression' from source: unknown 30529 1726882661.79878: variable 'ansible_shell_type' from source: unknown 30529 1726882661.79881: variable 'ansible_shell_executable' from source: unknown 30529 1726882661.79883: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882661.79885: variable 'ansible_pipelining' from source: unknown 30529 1726882661.79887: variable 'ansible_timeout' from source: unknown 30529 1726882661.79897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882661.80157: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882661.80176: variable 'omit' from source: magic vars 30529 1726882661.80182: starting attempt loop 30529 1726882661.80185: running the handler 30529 1726882661.80206: _low_level_execute_command(): starting 30529 1726882661.80213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882661.81014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882661.81129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882661.81134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882661.81136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882661.81269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882661.81338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882661.83075: stdout chunk (state=3): >>>/root <<< 30529 1726882661.83148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882661.83151: stdout chunk (state=3): >>><<< 30529 1726882661.83298: stderr chunk (state=3): >>><<< 30529 1726882661.83302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882661.83306: _low_level_execute_command(): starting 30529 1726882661.83309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587 `" && echo ansible-tmp-1726882661.8318586-34093-136787164742587="` echo /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587 `" ) && sleep 0' 30529 1726882661.84575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882661.84591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882661.84691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882661.84817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882661.84923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882661.86803: stdout chunk (state=3): >>>ansible-tmp-1726882661.8318586-34093-136787164742587=/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587 <<< 30529 1726882661.86943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882661.86954: stdout chunk (state=3): >>><<< 30529 1726882661.86967: stderr chunk (state=3): >>><<< 30529 1726882661.86988: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882661.8318586-34093-136787164742587=/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882661.87145: variable 'ansible_module_compression' from source: unknown 30529 1726882661.87400: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882661.87404: variable 'ansible_facts' from source: unknown 30529 1726882661.87638: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py 30529 1726882661.88020: Sending initial data 30529 1726882661.88030: Sent initial data (162 bytes) 30529 1726882661.88904: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882661.89200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882661.89238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882661.89274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882661.90802: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882661.90830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882661.90889: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpxwomnm2u /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py <<< 30529 1726882661.90901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py" <<< 30529 1726882661.91128: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpxwomnm2u" to remote "/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py" <<< 30529 1726882661.92170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882661.92206: stderr chunk (state=3): >>><<< 30529 1726882661.92220: stdout chunk (state=3): >>><<< 30529 1726882661.92244: done transferring module to remote 30529 1726882661.92265: _low_level_execute_command(): starting 30529 1726882661.92274: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/ /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py && sleep 0' 30529 1726882661.92903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882661.92916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882661.93008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882661.93012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882661.93060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882661.93079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882661.93138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882661.94933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882661.94937: stdout chunk (state=3): >>><<< 30529 1726882661.94943: stderr chunk (state=3): >>><<< 30529 1726882661.94981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882661.94989: _low_level_execute_command(): starting 30529 1726882661.94992: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/AnsiballZ_service_facts.py && sleep 0' 30529 1726882661.95699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882661.95703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882661.95705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882661.95707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882661.95709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882661.95711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882661.95749: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882661.95761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882661.95820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.46862: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30529 1726882663.46876: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882663.48603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882663.48606: stdout chunk (state=3): >>><<< 30529 1726882663.48609: stderr chunk (state=3): >>><<< 30529 1726882663.48613: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882663.51155: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882663.51160: _low_level_execute_command(): starting 30529 1726882663.51163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882661.8318586-34093-136787164742587/ > /dev/null 2>&1 && sleep 0' 30529 1726882663.52801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882663.52821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882663.53099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882663.53102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.53105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.53148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.54898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882663.54901: stdout chunk (state=3): >>><<< 30529 1726882663.54911: stderr chunk (state=3): >>><<< 30529 1726882663.54940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882663.54946: handler run complete 30529 
1726882663.55671: variable 'ansible_facts' from source: unknown 30529 1726882663.56230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882663.57868: variable 'ansible_facts' from source: unknown 30529 1726882663.58132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882663.58661: attempt loop complete, returning result 30529 1726882663.58667: _execute() done 30529 1726882663.58669: dumping result to json 30529 1726882663.58722: done dumping result, returning 30529 1726882663.58732: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000001a1b] 30529 1726882663.58735: sending task result for task 12673a56-9f93-b0f1-edc0-000000001a1b 30529 1726882663.61476: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001a1b 30529 1726882663.61480: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882663.61542: no more pending results, returning what we have 30529 1726882663.61545: results queue empty 30529 1726882663.61546: checking for any_errors_fatal 30529 1726882663.61549: done checking for any_errors_fatal 30529 1726882663.61550: checking for max_fail_percentage 30529 1726882663.61551: done checking for max_fail_percentage 30529 1726882663.61552: checking to see if all hosts have failed and the running result is not ok 30529 1726882663.61553: done checking to see if all hosts have failed 30529 1726882663.61553: getting the remaining hosts for this loop 30529 1726882663.61555: done getting the remaining hosts for this loop 30529 1726882663.61558: getting the next task for host managed_node1 30529 1726882663.61564: done getting next task for host managed_node1 30529 1726882663.61566: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882663.61575: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882663.61584: getting variables 30529 1726882663.61586: in VariableManager get_vars() 30529 1726882663.61617: Calling all_inventory to load vars for managed_node1 30529 1726882663.61620: Calling groups_inventory to load vars for managed_node1 30529 1726882663.61622: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882663.61632: Calling all_plugins_play to load vars for managed_node1 30529 1726882663.61635: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882663.61638: Calling groups_plugins_play to load vars for managed_node1 30529 1726882663.64331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882663.66563: done with get_vars() 30529 1726882663.66586: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:43 -0400 (0:00:01.893) 0:01:17.694 ****** 30529 1726882663.66888: entering _queue_task() for managed_node1/package_facts 30529 1726882663.67953: worker is 1 (out of 1 available) 30529 1726882663.67963: exiting _queue_task() for managed_node1/package_facts 30529 1726882663.67975: done queuing things up, now waiting for results queue to drain 30529 1726882663.67976: waiting for pending results... 
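The task being queued here, `fedora.linux_system_roles.network : Check which packages are installed` (task path `.../roles/network/tasks/set_facts.yml:26`), maps to the `package_facts` module. A minimal sketch of an equivalent task (illustrative only; the conditional and package name are assumptions, not from the log):

```yaml
# Sketch: ansible.builtin.package_facts fills ansible_facts.packages,
# a dict keyed by package name, each value a list of installed versions.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto

- name: Report the NetworkManager package, if installed
  ansible.builtin.debug:
    var: ansible_facts.packages['NetworkManager']
  when: "'NetworkManager' in ansible_facts.packages"
```

As with `service_facts` above, the role gates later behavior on these facts rather than shelling out to `rpm`/`dnf` directly.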
30529 1726882663.68239: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882663.68400: in run() - task 12673a56-9f93-b0f1-edc0-000000001a1c 30529 1726882663.68405: variable 'ansible_search_path' from source: unknown 30529 1726882663.68408: variable 'ansible_search_path' from source: unknown 30529 1726882663.68444: calling self._execute() 30529 1726882663.68598: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882663.68603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882663.68605: variable 'omit' from source: magic vars 30529 1726882663.68936: variable 'ansible_distribution_major_version' from source: facts 30529 1726882663.68947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882663.68954: variable 'omit' from source: magic vars 30529 1726882663.69198: variable 'omit' from source: magic vars 30529 1726882663.69204: variable 'omit' from source: magic vars 30529 1726882663.69206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882663.69209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882663.69212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882663.69215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882663.69224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882663.69261: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882663.69265: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882663.69267: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882663.69383: Set connection var ansible_shell_executable to /bin/sh 30529 1726882663.69387: Set connection var ansible_pipelining to False 30529 1726882663.69392: Set connection var ansible_shell_type to sh 30529 1726882663.69402: Set connection var ansible_timeout to 10 30529 1726882663.69406: Set connection var ansible_connection to ssh 30529 1726882663.69440: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882663.69443: variable 'ansible_shell_executable' from source: unknown 30529 1726882663.69446: variable 'ansible_connection' from source: unknown 30529 1726882663.69449: variable 'ansible_module_compression' from source: unknown 30529 1726882663.69451: variable 'ansible_shell_type' from source: unknown 30529 1726882663.69453: variable 'ansible_shell_executable' from source: unknown 30529 1726882663.69455: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882663.69458: variable 'ansible_pipelining' from source: unknown 30529 1726882663.69460: variable 'ansible_timeout' from source: unknown 30529 1726882663.69462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882663.69658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882663.69663: variable 'omit' from source: magic vars 30529 1726882663.69665: starting attempt loop 30529 1726882663.69668: running the handler 30529 1726882663.69768: _low_level_execute_command(): starting 30529 1726882663.69876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882663.71600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882663.72021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882663.72025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882663.72028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.72232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.72309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.73945: stdout chunk (state=3): >>>/root <<< 30529 1726882663.74137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882663.74141: stderr chunk (state=3): >>><<< 30529 1726882663.74153: stdout chunk (state=3): >>><<< 30529 1726882663.74218: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882663.74230: _low_level_execute_command(): starting 30529 1726882663.74237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602 `" && echo ansible-tmp-1726882663.742165-34196-126662699673602="` echo /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602 `" ) && sleep 0' 30529 1726882663.75495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882663.75499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882663.75779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882663.75795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882663.75798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.75820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.75925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.77810: stdout chunk (state=3): >>>ansible-tmp-1726882663.742165-34196-126662699673602=/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602 <<< 30529 1726882663.77879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882663.77895: stdout chunk (state=3): >>><<< 30529 1726882663.77907: stderr chunk (state=3): >>><<< 30529 1726882663.77924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882663.742165-34196-126662699673602=/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882663.78043: variable 'ansible_module_compression' from source: unknown 30529 1726882663.78399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882663.78402: variable 'ansible_facts' from source: unknown 30529 1726882663.78749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py 30529 1726882663.79203: Sending initial data 30529 1726882663.79207: Sent initial data (161 bytes) 30529 1726882663.80692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882663.80698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882663.80991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.81035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.81072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.82590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882663.82692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882663.82740: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqg7gglm2 /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py <<< 30529 1726882663.82743: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py" <<< 30529 1726882663.82794: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqg7gglm2" to remote "/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py" <<< 30529 1726882663.85731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882663.85739: stderr chunk (state=3): >>><<< 30529 1726882663.85742: stdout chunk (state=3): >>><<< 30529 1726882663.85763: done transferring module to remote 30529 1726882663.85774: _low_level_execute_command(): starting 30529 1726882663.85777: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/ /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py && sleep 0' 30529 1726882663.87499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882663.87503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.87519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.87799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882663.89458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882663.89461: stdout chunk (state=3): >>><<< 30529 1726882663.89464: stderr chunk (state=3): >>><<< 30529 1726882663.89684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882663.89698: _low_level_execute_command(): starting 30529 1726882663.89701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/AnsiballZ_package_facts.py && sleep 0' 30529 1726882663.90444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882663.90453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882663.90509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882663.90569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882663.90590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882663.90600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882663.90728: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30529 1726882664.34391: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": 
[{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": 
"0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": 
"1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882664.36321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882664.36324: stdout chunk (state=3): >>><<< 30529 1726882664.36326: stderr chunk (state=3): >>><<< 30529 1726882664.36334: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": 
[{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": 
"1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", 
"release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882664.38892: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882664.38899: _low_level_execute_command(): starting 30529 1726882664.38902: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882663.742165-34196-126662699673602/ > /dev/null 2>&1 && sleep 0' 30529 1726882664.39584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882664.39618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882664.39697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882664.41523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882664.41585: stderr chunk (state=3): >>><<< 30529 1726882664.41588: stdout chunk (state=3): >>><<< 30529 1726882664.41699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882664.41704: handler run complete 30529 1726882664.42542: variable 'ansible_facts' from source: unknown 30529 1726882664.43122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882664.45413: variable 'ansible_facts' from source: unknown 30529 1726882664.45972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882664.46731: attempt loop complete, returning result 30529 1726882664.46749: _execute() done 30529 1726882664.46757: dumping result to json 30529 1726882664.46973: done dumping result, returning 30529 1726882664.46989: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000001a1c] 30529 1726882664.47002: sending task result for task 12673a56-9f93-b0f1-edc0-000000001a1c 30529 1726882664.51140: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001a1c 30529 1726882664.51144: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882664.51304: no more pending results, returning what we have 30529 1726882664.51307: results queue empty 30529 1726882664.51308: checking for any_errors_fatal 
30529 1726882664.51314: done checking for any_errors_fatal 30529 1726882664.51315: checking for max_fail_percentage 30529 1726882664.51316: done checking for max_fail_percentage 30529 1726882664.51317: checking to see if all hosts have failed and the running result is not ok 30529 1726882664.51318: done checking to see if all hosts have failed 30529 1726882664.51319: getting the remaining hosts for this loop 30529 1726882664.51320: done getting the remaining hosts for this loop 30529 1726882664.51323: getting the next task for host managed_node1 30529 1726882664.51331: done getting next task for host managed_node1 30529 1726882664.51334: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882664.51339: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882664.51353: getting variables 30529 1726882664.51354: in VariableManager get_vars() 30529 1726882664.51386: Calling all_inventory to load vars for managed_node1 30529 1726882664.51389: Calling groups_inventory to load vars for managed_node1 30529 1726882664.51392: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882664.51495: Calling all_plugins_play to load vars for managed_node1 30529 1726882664.51502: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882664.51507: Calling groups_plugins_play to load vars for managed_node1 30529 1726882664.53491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882664.55368: done with get_vars() 30529 1726882664.55416: done getting variables 30529 1726882664.55534: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:44 -0400 (0:00:00.888) 0:01:18.583 ****** 30529 1726882664.55729: entering _queue_task() for managed_node1/debug 30529 1726882664.56321: worker is 1 (out of 1 available) 30529 1726882664.56333: exiting _queue_task() for managed_node1/debug 30529 1726882664.56343: done queuing things up, now waiting for results queue to drain 30529 1726882664.56345: waiting for pending results... 
30529 1726882664.56587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882664.56649: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c0 30529 1726882664.56679: variable 'ansible_search_path' from source: unknown 30529 1726882664.56692: variable 'ansible_search_path' from source: unknown 30529 1726882664.56740: calling self._execute() 30529 1726882664.56867: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882664.56879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882664.56924: variable 'omit' from source: magic vars 30529 1726882664.57451: variable 'ansible_distribution_major_version' from source: facts 30529 1726882664.57557: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882664.57560: variable 'omit' from source: magic vars 30529 1726882664.57569: variable 'omit' from source: magic vars 30529 1726882664.57677: variable 'network_provider' from source: set_fact 30529 1726882664.57709: variable 'omit' from source: magic vars 30529 1726882664.57753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882664.57812: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882664.57838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882664.57883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882664.57886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882664.57995: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882664.57998: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882664.58002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.58046: Set connection var ansible_shell_executable to /bin/sh
30529 1726882664.58056: Set connection var ansible_pipelining to False
30529 1726882664.58063: Set connection var ansible_shell_type to sh
30529 1726882664.58103: Set connection var ansible_timeout to 10
30529 1726882664.58106: Set connection var ansible_connection to ssh
30529 1726882664.58109: Set connection var ansible_module_compression to ZIP_DEFLATED
30529 1726882664.58321: variable 'ansible_shell_executable' from source: unknown
30529 1726882664.58324: variable 'ansible_connection' from source: unknown
30529 1726882664.58327: variable 'ansible_module_compression' from source: unknown
30529 1726882664.58329: variable 'ansible_shell_type' from source: unknown
30529 1726882664.58331: variable 'ansible_shell_executable' from source: unknown
30529 1726882664.58333: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882664.58335: variable 'ansible_pipelining' from source: unknown
30529 1726882664.58337: variable 'ansible_timeout' from source: unknown
30529 1726882664.58339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.58506: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882664.58525: variable 'omit' from source: magic vars
30529 1726882664.58546: starting attempt loop
30529 1726882664.58554: running the handler
30529 1726882664.58611: handler run complete
30529 1726882664.58631: attempt loop complete, returning result
30529 1726882664.58647: _execute() done
30529 1726882664.58655: dumping result to json
30529 1726882664.58663: done dumping result, returning
30529 1726882664.58676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-0000000019c0]
30529 1726882664.58685: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c0
ok: [managed_node1] => {}

MSG:

Using network provider: nm
30529 1726882664.58940: no more pending results, returning what we have
30529 1726882664.58944: results queue empty
30529 1726882664.58945: checking for any_errors_fatal
30529 1726882664.58956: done checking for any_errors_fatal
30529 1726882664.58957: checking for max_fail_percentage
30529 1726882664.58959: done checking for max_fail_percentage
30529 1726882664.58960: checking to see if all hosts have failed and the running result is not ok
30529 1726882664.58961: done checking to see if all hosts have failed
30529 1726882664.58962: getting the remaining hosts for this loop
30529 1726882664.59045: done getting the remaining hosts for this loop
30529 1726882664.59052: getting the next task for host managed_node1
30529 1726882664.59061: done getting next task for host managed_node1
30529 1726882664.59065: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
30529 1726882664.59073: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882664.59097: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c0
30529 1726882664.59100: WORKER PROCESS EXITING
30529 1726882664.59110: getting variables
30529 1726882664.59113: in VariableManager get_vars()
30529 1726882664.59159: Calling all_inventory to load vars for managed_node1
30529 1726882664.59162: Calling groups_inventory to load vars for managed_node1
30529 1726882664.59164: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882664.59175: Calling all_plugins_play to load vars for managed_node1
30529 1726882664.59177: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882664.59180: Calling groups_plugins_play to load vars for managed_node1
30529 1726882664.62362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882664.64967: done with get_vars()
30529 1726882664.65027: done getting variables
30529 1726882664.65328: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:37:44 -0400 (0:00:00.096) 0:01:18.679 ******
30529 1726882664.65375: entering _queue_task() for managed_node1/fail
30529 1726882664.66207: worker is 1 (out of 1 available)
30529 1726882664.66220: exiting _queue_task() for managed_node1/fail
30529 1726882664.66363: done queuing things up, now waiting for results queue to drain
30529 1726882664.66366: waiting for pending results...
30529 1726882664.66936: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
30529 1726882664.67241: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c1
30529 1726882664.67248: variable 'ansible_search_path' from source: unknown
30529 1726882664.67251: variable 'ansible_search_path' from source: unknown
30529 1726882664.67255: calling self._execute()
30529 1726882664.67357: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882664.67361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.67365: variable 'omit' from source: magic vars
30529 1726882664.67786: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.67796: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882664.67853: variable 'network_state' from source: role '' defaults
30529 1726882664.67864: Evaluated conditional (network_state != {}): False
30529 1726882664.67867: when evaluation is False, skipping this task
30529 1726882664.67870: _execute() done
30529 1726882664.67873: dumping result to json
30529 1726882664.67876: done dumping result, returning
30529 1726882664.67882: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-0000000019c1]
30529 1726882664.67909: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c1
30529 1726882664.68076: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c1
30529 1726882664.68079: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30529 1726882664.68137: no more pending results, returning what we have
30529 1726882664.68142: results queue empty
30529 1726882664.68143: checking for any_errors_fatal
30529 1726882664.68151: done checking for any_errors_fatal
30529 1726882664.68152: checking for max_fail_percentage
30529 1726882664.68154: done checking for max_fail_percentage
30529 1726882664.68155: checking to see if all hosts have failed and the running result is not ok
30529 1726882664.68156: done checking to see if all hosts have failed
30529 1726882664.68157: getting the remaining hosts for this loop
30529 1726882664.68159: done getting the remaining hosts for this loop
30529 1726882664.68163: getting the next task for host managed_node1
30529 1726882664.68174: done getting next task for host managed_node1
30529 1726882664.68178: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30529 1726882664.68186: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882664.68221: getting variables
30529 1726882664.68224: in VariableManager get_vars()
30529 1726882664.68268: Calling all_inventory to load vars for managed_node1
30529 1726882664.68271: Calling groups_inventory to load vars for managed_node1
30529 1726882664.68273: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882664.68290: Calling all_plugins_play to load vars for managed_node1
30529 1726882664.68667: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882664.68673: Calling groups_plugins_play to load vars for managed_node1
30529 1726882664.70488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882664.72037: done with get_vars()
30529 1726882664.72061: done getting variables
30529 1726882664.72120: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:37:44 -0400 (0:00:00.067) 0:01:18.747 ******
30529 1726882664.72153: entering _queue_task() for managed_node1/fail
30529 1726882664.72454: worker is 1 (out of 1 available)
30529 1726882664.72467: exiting _queue_task() for managed_node1/fail
30529 1726882664.72479: done queuing things up, now waiting for results queue to drain
30529 1726882664.72480: waiting for pending results...
30529 1726882664.72879: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30529 1726882664.72913: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c2
30529 1726882664.72977: variable 'ansible_search_path' from source: unknown
30529 1726882664.72980: variable 'ansible_search_path' from source: unknown
30529 1726882664.72984: calling self._execute()
30529 1726882664.73200: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882664.73204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.73206: variable 'omit' from source: magic vars
30529 1726882664.73478: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.73494: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882664.73617: variable 'network_state' from source: role '' defaults
30529 1726882664.73627: Evaluated conditional (network_state != {}): False
30529 1726882664.73631: when evaluation is False, skipping this task
30529 1726882664.73634: _execute() done
30529 1726882664.73636: dumping result to json
30529 1726882664.73639: done dumping result, returning
30529 1726882664.73645: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-0000000019c2]
30529 1726882664.73648: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c2
30529 1726882664.73751: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c2
30529 1726882664.73755: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30529 1726882664.73809: no more pending results, returning what we have
30529 1726882664.73813: results queue empty
30529 1726882664.73814: checking for any_errors_fatal
30529 1726882664.73824: done checking for any_errors_fatal
30529 1726882664.73825: checking for max_fail_percentage
30529 1726882664.73827: done checking for max_fail_percentage
30529 1726882664.73828: checking to see if all hosts have failed and the running result is not ok
30529 1726882664.73829: done checking to see if all hosts have failed
30529 1726882664.73829: getting the remaining hosts for this loop
30529 1726882664.73831: done getting the remaining hosts for this loop
30529 1726882664.73836: getting the next task for host managed_node1
30529 1726882664.73845: done getting next task for host managed_node1
30529 1726882664.73850: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30529 1726882664.73856: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882664.73885: getting variables
30529 1726882664.73887: in VariableManager get_vars()
30529 1726882664.73938: Calling all_inventory to load vars for managed_node1
30529 1726882664.73941: Calling groups_inventory to load vars for managed_node1
30529 1726882664.73943: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882664.73956: Calling all_plugins_play to load vars for managed_node1
30529 1726882664.73960: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882664.73963: Calling groups_plugins_play to load vars for managed_node1
30529 1726882664.75535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882664.77038: done with get_vars()
30529 1726882664.77060: done getting variables
30529 1726882664.77123: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:37:44 -0400 (0:00:00.050) 0:01:18.797 ******
30529 1726882664.77161: entering _queue_task() for managed_node1/fail
30529 1726882664.77471: worker is 1 (out of 1 available)
30529 1726882664.77484: exiting _queue_task() for managed_node1/fail
30529 1726882664.77499: done queuing things up, now waiting for results queue to drain
30529 1726882664.77501: waiting for pending results...
30529 1726882664.77797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30529 1726882664.77943: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c3
30529 1726882664.77955: variable 'ansible_search_path' from source: unknown
30529 1726882664.77958: variable 'ansible_search_path' from source: unknown
30529 1726882664.78066: calling self._execute()
30529 1726882664.78091: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882664.78097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.78104: variable 'omit' from source: magic vars
30529 1726882664.78503: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.78511: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882664.78696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30529 1726882664.81022: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30529 1726882664.81058: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30529 1726882664.81104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30529 1726882664.81140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30529 1726882664.81167: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30529 1726882664.81255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.81307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.81330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.81371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.81387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.81516: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.81524: Evaluated conditional (ansible_distribution_major_version | int > 9): True
30529 1726882664.81636: variable 'ansible_distribution' from source: facts
30529 1726882664.81640: variable '__network_rh_distros' from source: role '' defaults
30529 1726882664.81643: Evaluated conditional (ansible_distribution in __network_rh_distros): True
30529 1726882664.81949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.81959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.81961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.81995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.82008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.82058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.82080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.82108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.82146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.82164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.82202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.82231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.82255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.82292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.82310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.82700: variable 'network_connections' from source: include params
30529 1726882664.82704: variable 'interface' from source: play vars
30529 1726882664.82706: variable 'interface' from source: play vars
30529 1726882664.82714: variable 'network_state' from source: role '' defaults
30529 1726882664.82791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30529 1726882664.82946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30529 1726882664.82990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30529 1726882664.83018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30529 1726882664.83046: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30529 1726882664.83097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30529 1726882664.83118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30529 1726882664.83143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.83168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30529 1726882664.83197: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
30529 1726882664.83201: when evaluation is False, skipping this task
30529 1726882664.83204: _execute() done
30529 1726882664.83206: dumping result to json
30529 1726882664.83208: done dumping result, returning
30529 1726882664.83246: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-0000000019c3]
30529 1726882664.83249: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c3
30529 1726882664.83416: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c3
30529 1726882664.83420: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
30529 1726882664.83466: no more pending results, returning what we have
30529 1726882664.83470: results queue empty
30529 1726882664.83471: checking for any_errors_fatal
30529 1726882664.83477: done checking for any_errors_fatal
30529 1726882664.83478: checking for max_fail_percentage
30529 1726882664.83480: done checking for max_fail_percentage
30529 1726882664.83481: checking to see if all hosts have failed and the running result is not ok
30529 1726882664.83482: done checking to see if all hosts have failed
30529 1726882664.83482: getting the remaining hosts for this loop
30529 1726882664.83484: done getting the remaining hosts for this loop
30529 1726882664.83488: getting the next task for host managed_node1
30529 1726882664.83499: done getting next task for host managed_node1
30529 1726882664.83502: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30529 1726882664.83508: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882664.83534: getting variables
30529 1726882664.83536: in VariableManager get_vars()
30529 1726882664.83580: Calling all_inventory to load vars for managed_node1
30529 1726882664.83583: Calling groups_inventory to load vars for managed_node1
30529 1726882664.83586: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882664.83785: Calling all_plugins_play to load vars for managed_node1
30529 1726882664.83789: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882664.83794: Calling groups_plugins_play to load vars for managed_node1
30529 1726882664.85131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882664.86827: done with get_vars()
30529 1726882664.86849: done getting variables
30529 1726882664.86913: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:37:44 -0400 (0:00:00.097) 0:01:18.895 ******
30529 1726882664.86951: entering _queue_task() for managed_node1/dnf
30529 1726882664.87304: worker is 1 (out of 1 available)
30529 1726882664.87317: exiting _queue_task() for managed_node1/dnf
30529 1726882664.87330: done queuing things up, now waiting for results queue to drain
30529 1726882664.87332: waiting for pending results...
30529 1726882664.87762: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30529 1726882664.87768: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c4
30529 1726882664.87771: variable 'ansible_search_path' from source: unknown
30529 1726882664.87774: variable 'ansible_search_path' from source: unknown
30529 1726882664.88054: calling self._execute()
30529 1726882664.88059: variable 'ansible_host' from source: host vars for 'managed_node1'
30529 1726882664.88062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
30529 1726882664.88064: variable 'omit' from source: magic vars
30529 1726882664.88631: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.88642: Evaluated conditional (ansible_distribution_major_version != '6'): True
30529 1726882664.89065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30529 1726882664.93663: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30529 1726882664.93840: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30529 1726882664.93876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30529 1726882664.94100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30529 1726882664.94104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30529 1726882664.94180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.94402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.94406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.94463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.94481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.94727: variable 'ansible_distribution' from source: facts
30529 1726882664.94731: variable 'ansible_distribution_major_version' from source: facts
30529 1726882664.94743: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30529 1726882664.94947: variable '__network_wireless_connections_defined' from source: role '' defaults
30529 1726882664.94995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.95021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.95054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.95082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.95101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.95145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.95168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.95195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.95383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.95386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.95392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30529 1726882664.95396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30529 1726882664.95399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.95401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30529 1726882664.95403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30529 1726882664.95536: variable 'network_connections' from source: include params
30529 1726882664.95552: variable 'interface' from source: play vars
30529 1726882664.95615: variable 'interface' from source: play vars
30529 1726882664.95694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30529 1726882664.95898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30529 1726882664.95932: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30529 1726882664.95947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30529 1726882664.95976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30529 1726882664.96037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30529 1726882664.96072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30529 1726882664.96201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30529 1726882664.96204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30529 1726882664.96207: variable '__network_team_connections_defined' from source: role '' defaults
30529 1726882664.96501: variable 'network_connections' from source: include params
30529 1726882664.96504: variable 'interface' from source: play vars
30529 1726882664.96507: variable 'interface' from source: play vars
30529 1726882664.96513: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30529 1726882664.96516: when evaluation is False, skipping this task
30529 1726882664.96518: _execute() done
30529 1726882664.96520: dumping result to json
30529 1726882664.96533: done dumping result, returning
30529 1726882664.96538: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000019c4]
30529 
1726882664.96541: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c4 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882664.96694: no more pending results, returning what we have 30529 1726882664.96698: results queue empty 30529 1726882664.96700: checking for any_errors_fatal 30529 1726882664.96708: done checking for any_errors_fatal 30529 1726882664.96709: checking for max_fail_percentage 30529 1726882664.96711: done checking for max_fail_percentage 30529 1726882664.96712: checking to see if all hosts have failed and the running result is not ok 30529 1726882664.96713: done checking to see if all hosts have failed 30529 1726882664.96714: getting the remaining hosts for this loop 30529 1726882664.96716: done getting the remaining hosts for this loop 30529 1726882664.96720: getting the next task for host managed_node1 30529 1726882664.96729: done getting next task for host managed_node1 30529 1726882664.96733: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882664.96739: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882664.96769: getting variables 30529 1726882664.96771: in VariableManager get_vars() 30529 1726882664.96923: Calling all_inventory to load vars for managed_node1 30529 1726882664.96926: Calling groups_inventory to load vars for managed_node1 30529 1726882664.96928: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882664.96940: Calling all_plugins_play to load vars for managed_node1 30529 1726882664.96943: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882664.96947: Calling groups_plugins_play to load vars for managed_node1 30529 1726882664.97463: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c4 30529 1726882664.97467: WORKER PROCESS EXITING 30529 1726882664.98912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.00865: done with get_vars() 30529 1726882665.00889: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882665.00969: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:45 -0400 (0:00:00.140) 0:01:19.036 ****** 30529 1726882665.01006: entering _queue_task() for managed_node1/yum 30529 1726882665.01335: worker is 1 (out of 1 available) 30529 1726882665.01349: exiting _queue_task() for managed_node1/yum 30529 1726882665.01361: done queuing things up, now waiting for results queue to drain 30529 1726882665.01363: waiting for pending results... 30529 1726882665.01650: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882665.02284: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c5 30529 1726882665.02506: variable 'ansible_search_path' from source: unknown 30529 1726882665.02510: variable 'ansible_search_path' from source: unknown 30529 1726882665.02547: calling self._execute() 30529 1726882665.02643: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.02646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.02654: variable 'omit' from source: magic vars 30529 1726882665.03547: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.03559: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.03963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882665.08669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882665.08877: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882665.08881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882665.08903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882665.09200: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882665.09204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.10028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.10057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.10196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.10200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.10410: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.10414: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882665.10418: when evaluation is False, skipping this task 30529 1726882665.10421: _execute() done 30529 1726882665.10423: dumping result to json 30529 1726882665.10426: done dumping result, 
returning 30529 1726882665.10433: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000019c5] 30529 1726882665.10437: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c5 30529 1726882665.10679: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c5 30529 1726882665.10683: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882665.10744: no more pending results, returning what we have 30529 1726882665.10748: results queue empty 30529 1726882665.10749: checking for any_errors_fatal 30529 1726882665.10756: done checking for any_errors_fatal 30529 1726882665.10757: checking for max_fail_percentage 30529 1726882665.10759: done checking for max_fail_percentage 30529 1726882665.10760: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.10761: done checking to see if all hosts have failed 30529 1726882665.10762: getting the remaining hosts for this loop 30529 1726882665.10764: done getting the remaining hosts for this loop 30529 1726882665.10769: getting the next task for host managed_node1 30529 1726882665.10778: done getting next task for host managed_node1 30529 1726882665.10782: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882665.10788: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882665.10817: getting variables 30529 1726882665.10819: in VariableManager get_vars() 30529 1726882665.10866: Calling all_inventory to load vars for managed_node1 30529 1726882665.10869: Calling groups_inventory to load vars for managed_node1 30529 1726882665.10872: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.10884: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.10887: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.10891: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.14227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.17465: done with get_vars() 30529 1726882665.17502: done getting variables 30529 1726882665.17565: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:45 -0400 (0:00:00.167) 0:01:19.204 ****** 30529 1726882665.17807: entering _queue_task() for managed_node1/fail 30529 1726882665.18289: worker is 1 (out of 1 available) 30529 1726882665.18506: exiting _queue_task() for managed_node1/fail 30529 1726882665.18519: done queuing things up, now waiting for results queue to drain 30529 1726882665.18521: waiting for pending results... 30529 1726882665.19086: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882665.19180: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c6 30529 1726882665.19461: variable 'ansible_search_path' from source: unknown 30529 1726882665.19465: variable 'ansible_search_path' from source: unknown 30529 1726882665.19468: calling self._execute() 30529 1726882665.19519: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.19529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.19581: variable 'omit' from source: magic vars 30529 1726882665.20573: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.20597: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.20901: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.21401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882665.24200: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882665.24241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882665.24278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882665.24314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882665.24344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882665.24428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.24499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.24504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.24546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.24566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.24615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 
1726882665.24636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.24661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.24801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.24816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.24855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.24877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.25018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.25058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.25069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882665.25471: variable 'network_connections' from source: include params 30529 1726882665.25486: variable 'interface' from source: play vars 30529 1726882665.26001: variable 'interface' from source: play vars 30529 1726882665.26005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882665.26008: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882665.26011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882665.26025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882665.26052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882665.26100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882665.26123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882665.26142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.26167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882665.26218: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882665.26462: variable 'network_connections' from source: include params 30529 1726882665.26467: variable 'interface' from source: play 
vars 30529 1726882665.26534: variable 'interface' from source: play vars 30529 1726882665.26560: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882665.26564: when evaluation is False, skipping this task 30529 1726882665.26566: _execute() done 30529 1726882665.26569: dumping result to json 30529 1726882665.26571: done dumping result, returning 30529 1726882665.26577: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000019c6] 30529 1726882665.26583: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c6 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882665.26731: no more pending results, returning what we have 30529 1726882665.26734: results queue empty 30529 1726882665.26735: checking for any_errors_fatal 30529 1726882665.26741: done checking for any_errors_fatal 30529 1726882665.26742: checking for max_fail_percentage 30529 1726882665.26744: done checking for max_fail_percentage 30529 1726882665.26745: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.26746: done checking to see if all hosts have failed 30529 1726882665.26746: getting the remaining hosts for this loop 30529 1726882665.26748: done getting the remaining hosts for this loop 30529 1726882665.26752: getting the next task for host managed_node1 30529 1726882665.26760: done getting next task for host managed_node1 30529 1726882665.26763: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882665.26768: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882665.26801: getting variables 30529 1726882665.26803: in VariableManager get_vars() 30529 1726882665.26844: Calling all_inventory to load vars for managed_node1 30529 1726882665.26847: Calling groups_inventory to load vars for managed_node1 30529 1726882665.26849: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.26858: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.26860: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.26863: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.27399: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c6 30529 1726882665.27403: WORKER PROCESS EXITING 30529 1726882665.28433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.30118: done with get_vars() 30529 1726882665.30150: done getting variables 30529 1726882665.30226: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:45 -0400 (0:00:00.124) 0:01:19.328 ****** 30529 1726882665.30274: entering _queue_task() for managed_node1/package 30529 1726882665.30673: worker is 1 (out of 1 available) 30529 1726882665.30804: exiting _queue_task() for managed_node1/package 30529 1726882665.30818: done queuing things up, now waiting for results queue to drain 30529 1726882665.30820: waiting for pending results... 30529 1726882665.31036: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882665.31187: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c7 30529 1726882665.31213: variable 'ansible_search_path' from source: unknown 30529 1726882665.31223: variable 'ansible_search_path' from source: unknown 30529 1726882665.31271: calling self._execute() 30529 1726882665.31385: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.31400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.31414: variable 'omit' from source: magic vars 30529 1726882665.31831: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.31846: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.32111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882665.32368: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 
1726882665.32425: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882665.32469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882665.32547: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882665.32671: variable 'network_packages' from source: role '' defaults 30529 1726882665.32873: variable '__network_provider_setup' from source: role '' defaults 30529 1726882665.32876: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882665.32878: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882665.32885: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882665.32950: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882665.33142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882665.35522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882665.35612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882665.35656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882665.35706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882665.35734: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882665.35816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.35844: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.35897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.35921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.35939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.35982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.36020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.36294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.36298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.36305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882665.36531: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882665.36645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.36668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.36695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.36730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.36745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.36834: variable 'ansible_python' from source: facts 30529 1726882665.36849: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882665.36929: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882665.37006: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882665.37128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.37155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.37174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.37215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.37229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.37272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.37296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.37321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.37358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.37373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.37510: variable 'network_connections' from source: include params 
30529 1726882665.37516: variable 'interface' from source: play vars 30529 1726882665.37619: variable 'interface' from source: play vars 30529 1726882665.37677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882665.37703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882665.37730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.37902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882665.37906: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.38064: variable 'network_connections' from source: include params 30529 1726882665.38067: variable 'interface' from source: play vars 30529 1726882665.38228: variable 'interface' from source: play vars 30529 1726882665.38232: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882665.38264: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.38553: variable 'network_connections' from source: include params 30529 1726882665.38556: variable 'interface' from source: play vars 30529 1726882665.38622: variable 'interface' from source: play vars 30529 1726882665.38643: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882665.38719: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882665.38997: variable 'network_connections' 
from source: include params 30529 1726882665.39000: variable 'interface' from source: play vars 30529 1726882665.39141: variable 'interface' from source: play vars 30529 1726882665.39144: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882665.39170: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882665.39176: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882665.39237: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882665.39450: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882665.39951: variable 'network_connections' from source: include params 30529 1726882665.39954: variable 'interface' from source: play vars 30529 1726882665.40010: variable 'interface' from source: play vars 30529 1726882665.40060: variable 'ansible_distribution' from source: facts 30529 1726882665.40063: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.40070: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.40073: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882665.40201: variable 'ansible_distribution' from source: facts 30529 1726882665.40204: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.40210: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.40225: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882665.40798: variable 'ansible_distribution' from source: facts 30529 1726882665.40803: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.40805: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.40807: variable 'network_provider' from source: set_fact 30529 
1726882665.40809: variable 'ansible_facts' from source: unknown 30529 1726882665.41100: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882665.41103: when evaluation is False, skipping this task 30529 1726882665.41106: _execute() done 30529 1726882665.41108: dumping result to json 30529 1726882665.41110: done dumping result, returning 30529 1726882665.41119: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-0000000019c7] 30529 1726882665.41125: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c7 30529 1726882665.41225: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c7 30529 1726882665.41228: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882665.41295: no more pending results, returning what we have 30529 1726882665.41299: results queue empty 30529 1726882665.41300: checking for any_errors_fatal 30529 1726882665.41306: done checking for any_errors_fatal 30529 1726882665.41307: checking for max_fail_percentage 30529 1726882665.41309: done checking for max_fail_percentage 30529 1726882665.41309: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.41310: done checking to see if all hosts have failed 30529 1726882665.41311: getting the remaining hosts for this loop 30529 1726882665.41313: done getting the remaining hosts for this loop 30529 1726882665.41316: getting the next task for host managed_node1 30529 1726882665.41326: done getting next task for host managed_node1 30529 1726882665.41329: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882665.41334: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882665.41358: getting variables 30529 1726882665.41360: in VariableManager get_vars() 30529 1726882665.41407: Calling all_inventory to load vars for managed_node1 30529 1726882665.41409: Calling groups_inventory to load vars for managed_node1 30529 1726882665.41411: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.41421: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.41423: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.41426: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.42985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.44480: done with get_vars() 30529 1726882665.44503: done getting variables 30529 1726882665.44559: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:45 -0400 (0:00:00.143) 0:01:19.472 ****** 30529 1726882665.44599: entering _queue_task() for managed_node1/package 30529 1726882665.44959: worker is 1 (out of 1 available) 30529 1726882665.44973: exiting _queue_task() for managed_node1/package 30529 1726882665.44985: done queuing things up, now waiting for results queue to drain 30529 1726882665.44987: waiting for pending results... 
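Annotation: the "Install packages" skip above hinges on the conditional `not network_packages is subset(ansible_facts.packages.keys())`. The `subset` test is Ansible's builtin test from `plugins/test/mathstuff.py` (loaded earlier in this log), and it reduces to plain set containment. The sketch below is a hypothetical re-creation of that logic with made-up package data; the real values come from the role defaults and the `package_facts` module.

```python
def is_subset(needed, available):
    """True when every item in `needed` also appears in `available`.
    Mirrors the semantics of Ansible's builtin `subset` test."""
    return set(needed) <= set(available)

# Hypothetical stand-ins: network_packages is a role default, and
# ansible_facts.packages maps package names to version info.
network_packages = ["NetworkManager"]
installed_packages = {"NetworkManager": ["1.48.x"], "kernel": ["6.x"]}

# The task's `when` condition: run only if something is still missing.
run_task = not is_subset(network_packages, installed_packages.keys())
# Every required package is present, so the condition is False and the
# task is skipped -- matching "Evaluated conditional (...): False" above.
```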
30529 1726882665.45415: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882665.45457: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c8 30529 1726882665.45478: variable 'ansible_search_path' from source: unknown 30529 1726882665.45488: variable 'ansible_search_path' from source: unknown 30529 1726882665.45536: calling self._execute() 30529 1726882665.45639: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.45652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.45667: variable 'omit' from source: magic vars 30529 1726882665.46051: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.46069: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.46190: variable 'network_state' from source: role '' defaults 30529 1726882665.46209: Evaluated conditional (network_state != {}): False 30529 1726882665.46269: when evaluation is False, skipping this task 30529 1726882665.46273: _execute() done 30529 1726882665.46275: dumping result to json 30529 1726882665.46277: done dumping result, returning 30529 1726882665.46280: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000019c8] 30529 1726882665.46283: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c8 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882665.46543: no more pending results, returning what we have 30529 1726882665.46548: results queue empty 30529 1726882665.46549: checking for any_errors_fatal 30529 1726882665.46558: done checking for any_errors_fatal 30529 1726882665.46558: checking for max_fail_percentage 30529 
1726882665.46561: done checking for max_fail_percentage 30529 1726882665.46562: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.46563: done checking to see if all hosts have failed 30529 1726882665.46563: getting the remaining hosts for this loop 30529 1726882665.46565: done getting the remaining hosts for this loop 30529 1726882665.46570: getting the next task for host managed_node1 30529 1726882665.46580: done getting next task for host managed_node1 30529 1726882665.46584: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882665.46590: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882665.46620: getting variables 30529 1726882665.46623: in VariableManager get_vars() 30529 1726882665.46664: Calling all_inventory to load vars for managed_node1 30529 1726882665.46667: Calling groups_inventory to load vars for managed_node1 30529 1726882665.46669: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.46682: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.46685: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.46688: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.46888: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c8 30529 1726882665.46891: WORKER PROCESS EXITING 30529 1726882665.48197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.49892: done with get_vars() 30529 1726882665.49916: done getting variables 30529 1726882665.49978: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:45 -0400 (0:00:00.054) 0:01:19.526 ****** 30529 1726882665.50016: entering _queue_task() for managed_node1/package 30529 1726882665.50367: worker is 1 (out of 1 available) 30529 1726882665.50381: exiting _queue_task() for managed_node1/package 30529 1726882665.50597: done queuing things up, now waiting for results queue to drain 30529 1726882665.50600: waiting for pending results... 
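Annotation: the "Install NetworkManager and nmstate when using network_state variable" skip above is guarded by `network_state != {}`. Since `network_state` comes from the role defaults (an empty dict when the caller never sets it), the guard evaluates to False and the install task never runs. A minimal sketch of that check, assuming the empty-dict default:

```python
# Role default when the playbook does not pass network_state
# ("variable 'network_state' from source: role '' defaults" in the log).
network_state = {}

# The task's `when` condition: only install nmstate tooling if the
# caller actually provided a desired network state.
install_nmstate = network_state != {}
# False here, so the task is skipped with "Conditional result was False".
```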
30529 1726882665.50691: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882665.50853: in run() - task 12673a56-9f93-b0f1-edc0-0000000019c9 30529 1726882665.50874: variable 'ansible_search_path' from source: unknown 30529 1726882665.50882: variable 'ansible_search_path' from source: unknown 30529 1726882665.50924: calling self._execute() 30529 1726882665.51025: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.51042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.51057: variable 'omit' from source: magic vars 30529 1726882665.51437: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.51454: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.51583: variable 'network_state' from source: role '' defaults 30529 1726882665.51604: Evaluated conditional (network_state != {}): False 30529 1726882665.51611: when evaluation is False, skipping this task 30529 1726882665.51619: _execute() done 30529 1726882665.51700: dumping result to json 30529 1726882665.51703: done dumping result, returning 30529 1726882665.51706: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000019c9] 30529 1726882665.51708: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c9 30529 1726882665.51782: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019c9 30529 1726882665.51785: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882665.51851: no more pending results, returning what we have 30529 1726882665.51856: results queue empty 30529 1726882665.51857: checking for 
any_errors_fatal 30529 1726882665.51864: done checking for any_errors_fatal 30529 1726882665.51865: checking for max_fail_percentage 30529 1726882665.51868: done checking for max_fail_percentage 30529 1726882665.51869: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.51870: done checking to see if all hosts have failed 30529 1726882665.51871: getting the remaining hosts for this loop 30529 1726882665.51872: done getting the remaining hosts for this loop 30529 1726882665.51876: getting the next task for host managed_node1 30529 1726882665.51887: done getting next task for host managed_node1 30529 1726882665.51890: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882665.51899: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882665.51929: getting variables 30529 1726882665.51931: in VariableManager get_vars() 30529 1726882665.51975: Calling all_inventory to load vars for managed_node1 30529 1726882665.51978: Calling groups_inventory to load vars for managed_node1 30529 1726882665.51980: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.51992: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.52103: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.52108: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.53594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.55143: done with get_vars() 30529 1726882665.55169: done getting variables 30529 1726882665.55233: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:45 -0400 (0:00:00.052) 0:01:19.578 ****** 30529 1726882665.55274: entering _queue_task() for managed_node1/service 30529 1726882665.55623: worker is 1 (out of 1 available) 30529 1726882665.55636: exiting _queue_task() for managed_node1/service 30529 1726882665.55649: done queuing things up, now waiting for results queue to drain 30529 1726882665.55651: waiting for pending results... 
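Annotation: each `TASK [...]` banner above ends with two timings, e.g. `(0:00:00.052) 0:01:19.578` -- the previous task's duration followed by the cumulative playbook runtime, both in `H:MM:SS.mmm`. A small, hypothetical parser for that banner format (the format is inferred from the log lines, not from any documented API):

```python
import re

def parse_timing(banner_tail: str) -> tuple[float, float]:
    """Return (task_seconds, total_seconds) from a banner fragment
    like '(0:00:00.052) 0:01:19.578'."""
    m = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", banner_tail)
    if m is None:
        raise ValueError(f"no timing found in: {banner_tail!r}")
    h1, m1, s1, h2, m2, s2 = m.groups()
    task = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return task, total
```

For example, `parse_timing("(0:00:00.052) 0:01:19.578")` yields roughly 0.052 s for the task and 79.578 s total, matching the banner before the python3-libnmstate task.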
30529 1726882665.56023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882665.56102: in run() - task 12673a56-9f93-b0f1-edc0-0000000019ca 30529 1726882665.56126: variable 'ansible_search_path' from source: unknown 30529 1726882665.56135: variable 'ansible_search_path' from source: unknown 30529 1726882665.56176: calling self._execute() 30529 1726882665.56288: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.56303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.56336: variable 'omit' from source: magic vars 30529 1726882665.56706: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.56767: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.56845: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.57049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882665.59905: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882665.60030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882665.60077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882665.60196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882665.60199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882665.60247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882665.60322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.60354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.60400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.60431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.60483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.60518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.60544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.60586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.60608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.60735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.60738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.60741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.60748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.60765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.60950: variable 'network_connections' from source: include params 30529 1726882665.60973: variable 'interface' from source: play vars 30529 1726882665.61046: variable 'interface' from source: play vars 30529 1726882665.61122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882665.61295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882665.61341: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882665.61379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882665.61420: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882665.61468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882665.61501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882665.61610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.61613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882665.61620: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882665.61872: variable 'network_connections' from source: include params 30529 1726882665.61883: variable 'interface' from source: play vars 30529 1726882665.61955: variable 'interface' from source: play vars 30529 1726882665.61984: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882665.61995: when evaluation is False, skipping this task 30529 1726882665.62004: _execute() done 30529 1726882665.62011: dumping result to json 30529 1726882665.62019: done dumping result, returning 30529 1726882665.62046: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000019ca] 30529 1726882665.62400: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019ca 30529 1726882665.62474: done sending task result for task 
12673a56-9f93-b0f1-edc0-0000000019ca 30529 1726882665.62486: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882665.62554: no more pending results, returning what we have 30529 1726882665.62558: results queue empty 30529 1726882665.62559: checking for any_errors_fatal 30529 1726882665.62568: done checking for any_errors_fatal 30529 1726882665.62569: checking for max_fail_percentage 30529 1726882665.62571: done checking for max_fail_percentage 30529 1726882665.62572: checking to see if all hosts have failed and the running result is not ok 30529 1726882665.62573: done checking to see if all hosts have failed 30529 1726882665.62574: getting the remaining hosts for this loop 30529 1726882665.62575: done getting the remaining hosts for this loop 30529 1726882665.62580: getting the next task for host managed_node1 30529 1726882665.62590: done getting next task for host managed_node1 30529 1726882665.62596: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882665.62602: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882665.62629: getting variables 30529 1726882665.62631: in VariableManager get_vars() 30529 1726882665.62674: Calling all_inventory to load vars for managed_node1 30529 1726882665.62676: Calling groups_inventory to load vars for managed_node1 30529 1726882665.62678: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882665.62688: Calling all_plugins_play to load vars for managed_node1 30529 1726882665.62691: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882665.63099: Calling groups_plugins_play to load vars for managed_node1 30529 1726882665.66405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882665.69363: done with get_vars() 30529 1726882665.69605: done getting variables 30529 1726882665.69668: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:45 -0400 (0:00:00.145) 0:01:19.724 ****** 30529 1726882665.69810: entering _queue_task() for managed_node1/service 30529 1726882665.70756: worker is 1 (out of 1 available) 30529 1726882665.70768: exiting _queue_task() for managed_node1/service 30529 1726882665.70781: done 
queuing things up, now waiting for results queue to drain 30529 1726882665.70782: waiting for pending results... 30529 1726882665.71414: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882665.71682: in run() - task 12673a56-9f93-b0f1-edc0-0000000019cb 30529 1726882665.71706: variable 'ansible_search_path' from source: unknown 30529 1726882665.71899: variable 'ansible_search_path' from source: unknown 30529 1726882665.71903: calling self._execute() 30529 1726882665.71979: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.72063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.72078: variable 'omit' from source: magic vars 30529 1726882665.72727: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.73174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882665.73425: variable 'network_provider' from source: set_fact 30529 1726882665.73434: variable 'network_state' from source: role '' defaults 30529 1726882665.73446: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882665.73454: variable 'omit' from source: magic vars 30529 1726882665.73630: variable 'omit' from source: magic vars 30529 1726882665.73664: variable 'network_service_name' from source: role '' defaults 30529 1726882665.73738: variable 'network_service_name' from source: role '' defaults 30529 1726882665.73961: variable '__network_provider_setup' from source: role '' defaults 30529 1726882665.74185: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882665.74188: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882665.74190: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882665.74339: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882665.74836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882665.77490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882665.77557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882665.77602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882665.77652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882665.77721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882665.77877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.78083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.78086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.78324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.78327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.78329: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.78362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.78403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.78517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.78584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.79118: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882665.79443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.79517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.79580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.79698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.79702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.79990: variable 'ansible_python' from source: facts 30529 1726882665.80018: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882665.80212: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882665.80295: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882665.80599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.80603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.80606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.80761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.80818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.80874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882665.80951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882665.80979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.81023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882665.81038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882665.81191: variable 'network_connections' from source: include params 30529 1726882665.81207: variable 'interface' from source: play vars 30529 1726882665.81308: variable 'interface' from source: play vars 30529 1726882665.81418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882665.81700: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882665.81703: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882665.81705: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882665.81745: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882665.81825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882665.81858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882665.81892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882665.81932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882665.81978: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.82250: variable 'network_connections' from source: include params 30529 1726882665.82262: variable 'interface' from source: play vars 30529 1726882665.82335: variable 'interface' from source: play vars 30529 1726882665.82371: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882665.82445: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882665.82898: variable 'network_connections' from source: include params 30529 1726882665.82904: variable 'interface' from source: play vars 30529 1726882665.83099: variable 'interface' from source: play vars 30529 1726882665.83102: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882665.83300: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882665.83895: variable 'network_connections' from source: include params 30529 1726882665.83898: variable 'interface' from source: play vars 30529 1726882665.83900: variable 'interface' from source: play vars 30529 1726882665.84029: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882665.84088: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882665.84216: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882665.84266: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882665.84800: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882665.85529: variable 'network_connections' from source: include params 30529 1726882665.85543: variable 'interface' from source: play vars 30529 1726882665.85607: variable 'interface' from source: play vars 30529 1726882665.85620: variable 'ansible_distribution' from source: facts 30529 1726882665.85628: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.85638: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.85661: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882665.85841: variable 'ansible_distribution' from source: facts 30529 1726882665.85851: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.85860: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.85881: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882665.86054: variable 'ansible_distribution' from source: facts 30529 1726882665.86064: variable '__network_rh_distros' from source: role '' defaults 30529 1726882665.86073: variable 'ansible_distribution_major_version' from source: facts 30529 1726882665.86117: variable 'network_provider' from source: set_fact 30529 1726882665.86145: variable 'omit' from source: magic vars 30529 1726882665.86176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882665.86217: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882665.86242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882665.86264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882665.86281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882665.86319: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882665.86327: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.86334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882665.86433: Set connection var ansible_shell_executable to /bin/sh 30529 1726882665.86442: Set connection var ansible_pipelining to False 30529 1726882665.86447: Set connection var ansible_shell_type to sh 30529 1726882665.86625: Set connection var ansible_timeout to 10 30529 1726882665.86629: Set connection var ansible_connection to ssh 30529 1726882665.86631: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882665.86634: variable 'ansible_shell_executable' from source: unknown 30529 1726882665.86636: variable 'ansible_connection' from source: unknown 30529 1726882665.86638: variable 'ansible_module_compression' from source: unknown 30529 1726882665.86640: variable 'ansible_shell_type' from source: unknown 30529 1726882665.86642: variable 'ansible_shell_executable' from source: unknown 30529 1726882665.86644: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882665.86645: variable 'ansible_pipelining' from source: unknown 30529 1726882665.86647: variable 'ansible_timeout' from source: unknown 30529 1726882665.86649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882665.86732: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882665.86765: variable 'omit' from source: magic vars 30529 1726882665.86855: starting attempt loop 30529 1726882665.86857: running the handler 30529 1726882665.86867: variable 'ansible_facts' from source: unknown 30529 1726882665.87725: _low_level_execute_command(): starting 30529 1726882665.87738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882665.88612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882665.88669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882665.88687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882665.88854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 
1726882665.89077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882665.90763: stdout chunk (state=3): >>>/root <<< 30529 1726882665.90991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882665.91008: stdout chunk (state=3): >>><<< 30529 1726882665.91026: stderr chunk (state=3): >>><<< 30529 1726882665.91053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882665.91141: _low_level_execute_command(): starting 30529 1726882665.91145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859 `" && echo ansible-tmp-1726882665.9105606-34337-140631941966859="` echo 
/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859 `" ) && sleep 0' 30529 1726882665.91686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882665.91707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882665.91810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882665.91827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882665.91964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882665.93929: stdout chunk (state=3): >>>ansible-tmp-1726882665.9105606-34337-140631941966859=/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859 <<< 30529 1726882665.93978: stdout chunk (state=3): >>><<< 30529 1726882665.94399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882665.94402: stderr chunk (state=3): >>><<< 30529 1726882665.94405: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882665.9105606-34337-140631941966859=/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882665.94407: variable 'ansible_module_compression' from source: unknown 30529 1726882665.94409: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882665.94411: variable 'ansible_facts' from source: unknown 30529 1726882665.94578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py 30529 1726882665.94790: Sending initial data 30529 1726882665.94802: Sent initial data (156 bytes) 30529 1726882665.95256: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882665.95270: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882665.95285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882665.95381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882665.95400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882665.95471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882665.97016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882665.97085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882665.97101: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py" <<< 30529 1726882665.97152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5504rrlm /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py <<< 30529 1726882665.97261: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5504rrlm" to remote "/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py" <<< 30529 1726882665.98786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882665.98804: stdout chunk (state=3): >>><<< 30529 1726882665.98955: stderr chunk (state=3): >>><<< 30529 1726882665.98958: done transferring module to remote 30529 1726882665.98960: _low_level_execute_command(): starting 30529 1726882665.98962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/ /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py && sleep 0' 30529 1726882665.99522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882665.99536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882665.99609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882665.99640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882665.99664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882665.99677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882665.99752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882666.01528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882666.01554: stdout chunk (state=3): >>><<< 30529 1726882666.01557: stderr chunk (state=3): >>><<< 30529 1726882666.01601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882666.01605: _low_level_execute_command(): starting 30529 1726882666.01607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/AnsiballZ_systemd.py && sleep 0' 30529 1726882666.02216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882666.02231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882666.02256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882666.02275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882666.02355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882666.02369: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.02423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882666.02441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882666.02469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882666.02551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882666.31317: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10899456", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316355072", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1873921000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882666.31336: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", 
"NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882666.31348: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882666.33098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882666.33102: stdout chunk (state=3): >>><<< 30529 1726882666.33299: stderr chunk (state=3): >>><<< 30529 1726882666.33304: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10899456", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316355072", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1873921000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882666.33340: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882666.33361: _low_level_execute_command(): starting 30529 1726882666.33366: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882665.9105606-34337-140631941966859/ > /dev/null 2>&1 && sleep 0' 30529 1726882666.33846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882666.33854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882666.33860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882666.33885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.33889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882666.33891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.33943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882666.33946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882666.33998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882666.35754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882666.35775: stderr chunk (state=3): >>><<< 30529 1726882666.35779: stdout chunk (state=3): >>><<< 30529 1726882666.35796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882666.35801: handler run complete 30529 1726882666.35839: attempt loop complete, returning result 30529 1726882666.35842: _execute() done 30529 1726882666.35845: dumping result to json 30529 1726882666.35856: done dumping result, returning 30529 1726882666.35864: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-0000000019cb] 30529 1726882666.35866: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cb 30529 1726882666.36196: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cb 30529 1726882666.36199: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882666.36252: no more pending results, returning what we have 30529 1726882666.36255: results queue empty 30529 1726882666.36256: checking for any_errors_fatal 30529 1726882666.36261: done checking for any_errors_fatal 30529 1726882666.36262: checking for max_fail_percentage 30529 1726882666.36264: done checking for max_fail_percentage 30529 1726882666.36264: checking to see if all hosts have failed and the running result is not ok 30529 1726882666.36265: done checking to see if all hosts have failed 30529 1726882666.36266: getting the remaining hosts for this loop 30529 1726882666.36268: done getting the remaining hosts for this loop 30529 1726882666.36271: getting the next task for host managed_node1 30529 1726882666.36279: done getting next task for host managed_node1 30529 1726882666.36282: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882666.36287: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882666.36301: getting variables 30529 1726882666.36303: in VariableManager get_vars() 30529 1726882666.36362: Calling all_inventory to load vars for managed_node1 30529 1726882666.36365: Calling groups_inventory to load vars for managed_node1 30529 1726882666.36368: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882666.36377: Calling all_plugins_play to load vars for managed_node1 30529 1726882666.36380: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882666.36382: Calling groups_plugins_play to load vars for managed_node1 30529 1726882666.37360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882666.38216: done with get_vars() 30529 1726882666.38233: done getting variables 30529 1726882666.38274: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:46 -0400 (0:00:00.684) 0:01:20.409 ****** 30529 1726882666.38307: entering _queue_task() for managed_node1/service 30529 1726882666.38532: worker is 1 (out of 1 available) 30529 1726882666.38546: exiting _queue_task() for managed_node1/service 30529 1726882666.38559: done queuing things up, now waiting for results queue to drain 30529 1726882666.38560: waiting for pending results... 
30529 1726882666.38739: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882666.38834: in run() - task 12673a56-9f93-b0f1-edc0-0000000019cc 30529 1726882666.38845: variable 'ansible_search_path' from source: unknown 30529 1726882666.38848: variable 'ansible_search_path' from source: unknown 30529 1726882666.38875: calling self._execute() 30529 1726882666.38953: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.38956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.38965: variable 'omit' from source: magic vars 30529 1726882666.39238: variable 'ansible_distribution_major_version' from source: facts 30529 1726882666.39249: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882666.39329: variable 'network_provider' from source: set_fact 30529 1726882666.39333: Evaluated conditional (network_provider == "nm"): True 30529 1726882666.39399: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882666.39457: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882666.39568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882666.46701: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882666.46705: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882666.46708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882666.46710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882666.46712: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882666.46788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882666.46826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882666.46855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882666.46902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882666.46923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882666.46973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882666.47000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882666.47031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882666.47075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882666.47101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882666.47144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882666.47171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882666.47200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882666.47240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882666.47259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882666.47399: variable 'network_connections' from source: include params 30529 1726882666.47415: variable 'interface' from source: play vars 30529 1726882666.47489: variable 'interface' from source: play vars 30529 1726882666.47562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882666.47721: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882666.47760: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882666.47795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882666.47826: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882666.47870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882666.47899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882666.47929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882666.47959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882666.48002: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882666.48233: variable 'network_connections' from source: include params 30529 1726882666.48239: variable 'interface' from source: play vars 30529 1726882666.48284: variable 'interface' from source: play vars 30529 1726882666.48311: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882666.48314: when evaluation is False, skipping this task 30529 1726882666.48317: _execute() done 30529 1726882666.48320: dumping result to json 30529 1726882666.48322: done dumping result, returning 30529 1726882666.48332: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-0000000019cc] 30529 
1726882666.48343: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cc 30529 1726882666.48428: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cc 30529 1726882666.48432: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882666.48470: no more pending results, returning what we have 30529 1726882666.48473: results queue empty 30529 1726882666.48474: checking for any_errors_fatal 30529 1726882666.48490: done checking for any_errors_fatal 30529 1726882666.48490: checking for max_fail_percentage 30529 1726882666.48494: done checking for max_fail_percentage 30529 1726882666.48495: checking to see if all hosts have failed and the running result is not ok 30529 1726882666.48496: done checking to see if all hosts have failed 30529 1726882666.48496: getting the remaining hosts for this loop 30529 1726882666.48498: done getting the remaining hosts for this loop 30529 1726882666.48501: getting the next task for host managed_node1 30529 1726882666.48508: done getting next task for host managed_node1 30529 1726882666.48512: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882666.48517: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882666.48537: getting variables 30529 1726882666.48539: in VariableManager get_vars() 30529 1726882666.48575: Calling all_inventory to load vars for managed_node1 30529 1726882666.48577: Calling groups_inventory to load vars for managed_node1 30529 1726882666.48579: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882666.48588: Calling all_plugins_play to load vars for managed_node1 30529 1726882666.48590: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882666.48594: Calling groups_plugins_play to load vars for managed_node1 30529 1726882666.54832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882666.56336: done with get_vars() 30529 1726882666.56359: done getting variables 30529 1726882666.56404: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:46 -0400 (0:00:00.181) 0:01:20.590 
****** 30529 1726882666.56432: entering _queue_task() for managed_node1/service 30529 1726882666.57020: worker is 1 (out of 1 available) 30529 1726882666.57028: exiting _queue_task() for managed_node1/service 30529 1726882666.57038: done queuing things up, now waiting for results queue to drain 30529 1726882666.57040: waiting for pending results... 30529 1726882666.57171: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882666.57375: in run() - task 12673a56-9f93-b0f1-edc0-0000000019cd 30529 1726882666.57379: variable 'ansible_search_path' from source: unknown 30529 1726882666.57383: variable 'ansible_search_path' from source: unknown 30529 1726882666.57387: calling self._execute() 30529 1726882666.57455: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.57466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.57485: variable 'omit' from source: magic vars 30529 1726882666.57873: variable 'ansible_distribution_major_version' from source: facts 30529 1726882666.57891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882666.58020: variable 'network_provider' from source: set_fact 30529 1726882666.58038: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882666.58046: when evaluation is False, skipping this task 30529 1726882666.58052: _execute() done 30529 1726882666.58059: dumping result to json 30529 1726882666.58065: done dumping result, returning 30529 1726882666.58076: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-0000000019cd] 30529 1726882666.58140: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cd 30529 1726882666.58215: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cd 30529 1726882666.58218: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882666.58284: no more pending results, returning what we have 30529 1726882666.58288: results queue empty 30529 1726882666.58289: checking for any_errors_fatal 30529 1726882666.58303: done checking for any_errors_fatal 30529 1726882666.58304: checking for max_fail_percentage 30529 1726882666.58306: done checking for max_fail_percentage 30529 1726882666.58307: checking to see if all hosts have failed and the running result is not ok 30529 1726882666.58308: done checking to see if all hosts have failed 30529 1726882666.58308: getting the remaining hosts for this loop 30529 1726882666.58310: done getting the remaining hosts for this loop 30529 1726882666.58314: getting the next task for host managed_node1 30529 1726882666.58324: done getting next task for host managed_node1 30529 1726882666.58328: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882666.58335: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882666.58362: getting variables 30529 1726882666.58364: in VariableManager get_vars() 30529 1726882666.58405: Calling all_inventory to load vars for managed_node1 30529 1726882666.58408: Calling groups_inventory to load vars for managed_node1 30529 1726882666.58411: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882666.58424: Calling all_plugins_play to load vars for managed_node1 30529 1726882666.58427: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882666.58430: Calling groups_plugins_play to load vars for managed_node1 30529 1726882666.59992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882666.61523: done with get_vars() 30529 1726882666.61544: done getting variables 30529 1726882666.61605: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:46 -0400 (0:00:00.052) 0:01:20.642 ****** 30529 1726882666.61643: entering _queue_task() for managed_node1/copy 30529 1726882666.61958: worker is 1 (out of 1 available) 30529 1726882666.61973: exiting _queue_task() for managed_node1/copy 30529 1726882666.61986: done queuing things up, now waiting for results queue to drain 30529 1726882666.61987: waiting for 
pending results... 30529 1726882666.62415: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882666.62466: in run() - task 12673a56-9f93-b0f1-edc0-0000000019ce 30529 1726882666.62487: variable 'ansible_search_path' from source: unknown 30529 1726882666.62498: variable 'ansible_search_path' from source: unknown 30529 1726882666.62543: calling self._execute() 30529 1726882666.62645: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.62658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.62678: variable 'omit' from source: magic vars 30529 1726882666.63073: variable 'ansible_distribution_major_version' from source: facts 30529 1726882666.63098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882666.63299: variable 'network_provider' from source: set_fact 30529 1726882666.63303: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882666.63306: when evaluation is False, skipping this task 30529 1726882666.63309: _execute() done 30529 1726882666.63311: dumping result to json 30529 1726882666.63314: done dumping result, returning 30529 1726882666.63317: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-0000000019ce] 30529 1726882666.63320: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019ce skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882666.63441: no more pending results, returning what we have 30529 1726882666.63446: results queue empty 30529 1726882666.63447: checking for any_errors_fatal 30529 1726882666.63453: done checking for any_errors_fatal 30529 1726882666.63454: checking for 
max_fail_percentage 30529 1726882666.63455: done checking for max_fail_percentage 30529 1726882666.63456: checking to see if all hosts have failed and the running result is not ok 30529 1726882666.63457: done checking to see if all hosts have failed 30529 1726882666.63458: getting the remaining hosts for this loop 30529 1726882666.63459: done getting the remaining hosts for this loop 30529 1726882666.63463: getting the next task for host managed_node1 30529 1726882666.63471: done getting next task for host managed_node1 30529 1726882666.63475: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882666.63480: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882666.63506: getting variables 30529 1726882666.63509: in VariableManager get_vars() 30529 1726882666.63551: Calling all_inventory to load vars for managed_node1 30529 1726882666.63554: Calling groups_inventory to load vars for managed_node1 30529 1726882666.63556: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882666.63569: Calling all_plugins_play to load vars for managed_node1 30529 1726882666.63573: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882666.63577: Calling groups_plugins_play to load vars for managed_node1 30529 1726882666.64380: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019ce 30529 1726882666.64384: WORKER PROCESS EXITING 30529 1726882666.65359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882666.67007: done with get_vars() 30529 1726882666.67028: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:46 -0400 (0:00:00.054) 0:01:20.697 ****** 30529 1726882666.67119: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882666.67418: worker is 1 (out of 1 available) 30529 1726882666.67429: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882666.67441: done queuing things up, now waiting for results queue to drain 30529 1726882666.67443: waiting for pending results... 
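Both initscripts tasks above ("Enable network service" and "Ensure initscripts network file dependency is present") are skipped for the same reason: `network_provider` was set to `nm` via `set_fact`, so `Evaluated conditional (network_provider == "initscripts")` returns `False` and the executor short-circuits with `skip_reason: "Conditional result was False"`. A minimal standalone reproduction of that skip pattern (hypothetical playbook, not the role's source):

```yaml
# Minimal sketch: a task gated the same way as the skipped tasks above.
# With network_provider set to "nm", ansible-playbook reports the task as
# skipping, with false_condition: network_provider == "initscripts",
# without ever contacting the target host.
- hosts: managed_node1
  gather_facts: false
  vars:
    network_provider: nm
  tasks:
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network
        content: "# Created by ansible\n"
      when: network_provider == "initscripts"
```

The "Configure networking connection profiles" task queued next is the first one in this stretch whose conditions hold, which is why the trace that follows proceeds to open an SSH connection and create a remote temporary directory.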
30529 1726882666.67735: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882666.67878: in run() - task 12673a56-9f93-b0f1-edc0-0000000019cf 30529 1726882666.67904: variable 'ansible_search_path' from source: unknown 30529 1726882666.67917: variable 'ansible_search_path' from source: unknown 30529 1726882666.67953: calling self._execute() 30529 1726882666.68063: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.68076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.68135: variable 'omit' from source: magic vars 30529 1726882666.68800: variable 'ansible_distribution_major_version' from source: facts 30529 1726882666.68803: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882666.68807: variable 'omit' from source: magic vars 30529 1726882666.68811: variable 'omit' from source: magic vars 30529 1726882666.68886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882666.72711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882666.72975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882666.73014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882666.73046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882666.73298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882666.73302: variable 'network_provider' from source: set_fact 30529 1726882666.73500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882666.73643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882666.73667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882666.73812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882666.73826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882666.73902: variable 'omit' from source: magic vars 30529 1726882666.74194: variable 'omit' from source: magic vars 30529 1726882666.74410: variable 'network_connections' from source: include params 30529 1726882666.74421: variable 'interface' from source: play vars 30529 1726882666.74481: variable 'interface' from source: play vars 30529 1726882666.74844: variable 'omit' from source: magic vars 30529 1726882666.74852: variable '__lsr_ansible_managed' from source: task vars 30529 1726882666.75010: variable '__lsr_ansible_managed' from source: task vars 30529 1726882666.75283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882666.75808: Loaded config def from plugin (lookup/template) 30529 1726882666.75811: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882666.75840: File lookup term: get_ansible_managed.j2 30529 1726882666.75843: variable 
'ansible_search_path' from source: unknown 30529 1726882666.75847: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882666.75860: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882666.75876: variable 'ansible_search_path' from source: unknown 30529 1726882666.89587: variable 'ansible_managed' from source: unknown 30529 1726882666.89828: variable 'omit' from source: magic vars 30529 1726882666.89852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882666.89878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882666.90015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882666.90032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882666.90043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882666.90070: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882666.90073: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.90076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.90291: Set connection var ansible_shell_executable to /bin/sh 30529 1726882666.90296: Set connection var ansible_pipelining to False 30529 1726882666.90498: Set connection var ansible_shell_type to sh 30529 1726882666.90501: Set connection var ansible_timeout to 10 30529 1726882666.90504: Set connection var ansible_connection to ssh 30529 1726882666.90506: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882666.90508: variable 'ansible_shell_executable' from source: unknown 30529 1726882666.90511: variable 'ansible_connection' from source: unknown 30529 1726882666.90513: variable 'ansible_module_compression' from source: unknown 30529 1726882666.90515: variable 'ansible_shell_type' from source: unknown 30529 1726882666.90517: variable 'ansible_shell_executable' from source: unknown 30529 1726882666.90520: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882666.90522: variable 'ansible_pipelining' from source: unknown 30529 1726882666.90525: variable 'ansible_timeout' from source: unknown 30529 1726882666.90527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882666.90773: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882666.90784: variable 'omit' from 
source: magic vars 30529 1726882666.90790: starting attempt loop 30529 1726882666.90795: running the handler 30529 1726882666.90807: _low_level_execute_command(): starting 30529 1726882666.90815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882666.92298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.92304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882666.92338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882666.92424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882666.94142: stdout chunk (state=3): >>>/root <<< 30529 1726882666.94237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882666.94404: stderr chunk (state=3): >>><<< 30529 1726882666.94407: stdout chunk (state=3): >>><<< 30529 1726882666.94432: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882666.94444: _low_level_execute_command(): starting 30529 1726882666.94499: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814 `" && echo ansible-tmp-1726882666.9443202-34397-179531957282814="` echo /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814 `" ) && sleep 0' 30529 1726882666.95608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.95743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882666.95808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882666.95924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882666.97795: stdout chunk (state=3): >>>ansible-tmp-1726882666.9443202-34397-179531957282814=/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814 <<< 30529 1726882666.97933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882666.97936: stdout chunk (state=3): >>><<< 30529 1726882666.97939: stderr chunk (state=3): >>><<< 30529 1726882666.98043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882666.9443202-34397-179531957282814=/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882666.98048: variable 'ansible_module_compression' from source: unknown 30529 1726882666.98050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882666.98191: variable 'ansible_facts' from source: unknown 30529 1726882666.98372: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py 30529 1726882666.98627: Sending initial data 30529 1726882666.98630: Sent initial data (168 bytes) 30529 1726882666.99963: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882666.99966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.00135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.00253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.00312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.01837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882667.01945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882667.01985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpide23uub /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py <<< 30529 1726882667.02009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py" <<< 30529 1726882667.02144: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpide23uub" to remote "/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py" <<< 30529 1726882667.03926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.04017: stderr chunk (state=3): >>><<< 30529 1726882667.04211: stdout chunk (state=3): >>><<< 30529 1726882667.04232: done transferring module to remote 30529 1726882667.04242: _low_level_execute_command(): starting 30529 1726882667.04247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/ /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py && sleep 0' 30529 1726882667.05130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.05133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882667.05136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.05138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882667.05140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882667.05142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.05222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.05261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.07400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.07404: stdout chunk (state=3): >>><<< 30529 1726882667.07406: stderr chunk (state=3): >>><<< 30529 1726882667.07409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.07411: _low_level_execute_command(): starting 30529 1726882667.07413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/AnsiballZ_network_connections.py && sleep 0' 30529 1726882667.08091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.08144: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30529 1726882667.32677: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882667.34392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882667.34399: stdout chunk (state=3): >>><<< 30529 1726882667.34402: stderr chunk (state=3): >>><<< 30529 1726882667.34404: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882667.34407: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882667.34409: _low_level_execute_command(): starting 30529 
1726882667.34411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882666.9443202-34397-179531957282814/ > /dev/null 2>&1 && sleep 0' 30529 1726882667.35059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882667.35084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.35202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.35225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.35321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.37198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.37202: stdout chunk (state=3): >>><<< 30529 1726882667.37204: stderr chunk (state=3): >>><<< 30529 1726882667.37207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.37209: handler run complete 30529 1726882667.37212: attempt loop complete, returning result 30529 1726882667.37214: _execute() done 30529 1726882667.37217: dumping result to json 30529 1726882667.37219: done dumping result, returning 30529 1726882667.37230: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-0000000019cf] 30529 1726882667.37233: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cf 30529 1726882667.37347: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019cf 30529 1726882667.37349: WORKER PROCESS EXITING ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": 
false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active 30529 1726882667.37443: no more pending results, returning what we have 30529 1726882667.37446: results queue empty 30529 1726882667.37447: checking for any_errors_fatal 30529 1726882667.37454: done checking for any_errors_fatal 30529 1726882667.37454: checking for max_fail_percentage 30529 1726882667.37456: done checking for max_fail_percentage 30529 1726882667.37457: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.37458: done checking to see if all hosts have failed 30529 1726882667.37463: getting the remaining hosts for this loop 30529 1726882667.37465: done getting the remaining hosts for this loop 30529 1726882667.37468: getting the next task for host managed_node1 30529 1726882667.37476: done getting next task for host managed_node1 30529 1726882667.37479: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882667.37483: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882667.37498: getting variables 30529 1726882667.37500: in VariableManager get_vars() 30529 1726882667.37539: Calling all_inventory to load vars for managed_node1 30529 1726882667.37541: Calling groups_inventory to load vars for managed_node1 30529 1726882667.37544: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.37553: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.37556: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.37558: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.39128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.41001: done with get_vars() 30529 1726882667.41025: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:47 -0400 (0:00:00.740) 0:01:21.437 ****** 30529 1726882667.41125: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882667.41550: worker is 1 (out of 1 available) 30529 1726882667.41561: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882667.41575: done queuing things up, now waiting for results queue to drain 30529 1726882667.41577: waiting for pending results... 
30529 1726882667.42214: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882667.42219: in run() - task 12673a56-9f93-b0f1-edc0-0000000019d0 30529 1726882667.42222: variable 'ansible_search_path' from source: unknown 30529 1726882667.42225: variable 'ansible_search_path' from source: unknown 30529 1726882667.42228: calling self._execute() 30529 1726882667.42284: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.42288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.42304: variable 'omit' from source: magic vars 30529 1726882667.42736: variable 'ansible_distribution_major_version' from source: facts 30529 1726882667.42748: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882667.42885: variable 'network_state' from source: role '' defaults 30529 1726882667.42897: Evaluated conditional (network_state != {}): False 30529 1726882667.42900: when evaluation is False, skipping this task 30529 1726882667.42903: _execute() done 30529 1726882667.42917: dumping result to json 30529 1726882667.42920: done dumping result, returning 30529 1726882667.42928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-0000000019d0] 30529 1726882667.42933: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d0 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882667.43079: no more pending results, returning what we have 30529 1726882667.43084: results queue empty 30529 1726882667.43085: checking for any_errors_fatal 30529 1726882667.43104: done checking for any_errors_fatal 30529 1726882667.43105: checking for max_fail_percentage 30529 1726882667.43107: done checking for max_fail_percentage 30529 1726882667.43108: 
checking to see if all hosts have failed and the running result is not ok 30529 1726882667.43109: done checking to see if all hosts have failed 30529 1726882667.43110: getting the remaining hosts for this loop 30529 1726882667.43111: done getting the remaining hosts for this loop 30529 1726882667.43116: getting the next task for host managed_node1 30529 1726882667.43127: done getting next task for host managed_node1 30529 1726882667.43132: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882667.43138: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882667.43157: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d0 30529 1726882667.43160: WORKER PROCESS EXITING 30529 1726882667.43319: getting variables 30529 1726882667.43321: in VariableManager get_vars() 30529 1726882667.43365: Calling all_inventory to load vars for managed_node1 30529 1726882667.43368: Calling groups_inventory to load vars for managed_node1 30529 1726882667.43370: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.43382: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.43386: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.43392: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.45009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.46737: done with get_vars() 30529 1726882667.46773: done getting variables 30529 1726882667.46838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:47 -0400 (0:00:00.057) 0:01:21.494 ****** 30529 1726882667.46884: entering _queue_task() for managed_node1/debug 30529 1726882667.47287: worker is 1 (out of 1 available) 30529 1726882667.47407: exiting _queue_task() for managed_node1/debug 30529 1726882667.47418: done queuing things up, now waiting for results queue to drain 30529 1726882667.47420: waiting for pending results... 
30529 1726882667.47639: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882667.48000: in run() - task 12673a56-9f93-b0f1-edc0-0000000019d1 30529 1726882667.48004: variable 'ansible_search_path' from source: unknown 30529 1726882667.48007: variable 'ansible_search_path' from source: unknown 30529 1726882667.48010: calling self._execute() 30529 1726882667.48013: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.48016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.48019: variable 'omit' from source: magic vars 30529 1726882667.48440: variable 'ansible_distribution_major_version' from source: facts 30529 1726882667.48451: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882667.48459: variable 'omit' from source: magic vars 30529 1726882667.48535: variable 'omit' from source: magic vars 30529 1726882667.48567: variable 'omit' from source: magic vars 30529 1726882667.48619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882667.48654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882667.48673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882667.48690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.48715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.48743: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882667.48747: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.48750: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882667.48862: Set connection var ansible_shell_executable to /bin/sh 30529 1726882667.48865: Set connection var ansible_pipelining to False 30529 1726882667.48868: Set connection var ansible_shell_type to sh 30529 1726882667.48878: Set connection var ansible_timeout to 10 30529 1726882667.48881: Set connection var ansible_connection to ssh 30529 1726882667.48886: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882667.48914: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.48917: variable 'ansible_connection' from source: unknown 30529 1726882667.48928: variable 'ansible_module_compression' from source: unknown 30529 1726882667.48931: variable 'ansible_shell_type' from source: unknown 30529 1726882667.48933: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.48937: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.48941: variable 'ansible_pipelining' from source: unknown 30529 1726882667.48944: variable 'ansible_timeout' from source: unknown 30529 1726882667.48948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.49092: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882667.49106: variable 'omit' from source: magic vars 30529 1726882667.49112: starting attempt loop 30529 1726882667.49115: running the handler 30529 1726882667.49298: variable '__network_connections_result' from source: set_fact 30529 1726882667.49318: handler run complete 30529 1726882667.49335: attempt loop complete, returning result 30529 1726882667.49338: _execute() done 30529 1726882667.49341: dumping result to json 30529 1726882667.49343: 
done dumping result, returning 30529 1726882667.49353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000019d1] 30529 1726882667.49368: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d1 30529 1726882667.49558: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d1 30529 1726882667.49561: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active" ] } 30529 1726882667.49646: no more pending results, returning what we have 30529 1726882667.49650: results queue empty 30529 1726882667.49651: checking for any_errors_fatal 30529 1726882667.49657: done checking for any_errors_fatal 30529 1726882667.49658: checking for max_fail_percentage 30529 1726882667.49660: done checking for max_fail_percentage 30529 1726882667.49661: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.49662: done checking to see if all hosts have failed 30529 1726882667.49663: getting the remaining hosts for this loop 30529 1726882667.49665: done getting the remaining hosts for this loop 30529 1726882667.49669: getting the next task for host managed_node1 30529 1726882667.49678: done getting next task for host managed_node1 30529 1726882667.49682: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882667.49690: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882667.49706: getting variables 30529 1726882667.49708: in VariableManager get_vars() 30529 1726882667.49750: Calling all_inventory to load vars for managed_node1 30529 1726882667.49752: Calling groups_inventory to load vars for managed_node1 30529 1726882667.49755: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.49766: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.49769: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.49772: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.51608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.53423: done with get_vars() 30529 1726882667.53457: done getting variables 30529 1726882667.53530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:47 -0400 (0:00:00.066) 0:01:21.561 ****** 30529 1726882667.53576: entering _queue_task() for managed_node1/debug 30529 1726882667.53981: worker is 1 (out of 1 available) 30529 1726882667.54100: exiting _queue_task() for managed_node1/debug 30529 1726882667.54112: done queuing things up, now waiting for results queue to drain 30529 1726882667.54114: waiting for pending results... 30529 1726882667.54356: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882667.54587: in run() - task 12673a56-9f93-b0f1-edc0-0000000019d2 30529 1726882667.54591: variable 'ansible_search_path' from source: unknown 30529 1726882667.54595: variable 'ansible_search_path' from source: unknown 30529 1726882667.54598: calling self._execute() 30529 1726882667.54706: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.54710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.54720: variable 'omit' from source: magic vars 30529 1726882667.55160: variable 'ansible_distribution_major_version' from source: facts 30529 1726882667.55172: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882667.55178: variable 'omit' from source: magic vars 30529 1726882667.55262: variable 'omit' from source: magic vars 30529 1726882667.55341: variable 'omit' from source: magic vars 30529 1726882667.55351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882667.55390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882667.55416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882667.55433: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.55498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.55501: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882667.55504: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.55506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.55617: Set connection var ansible_shell_executable to /bin/sh 30529 1726882667.55621: Set connection var ansible_pipelining to False 30529 1726882667.55624: Set connection var ansible_shell_type to sh 30529 1726882667.55634: Set connection var ansible_timeout to 10 30529 1726882667.55637: Set connection var ansible_connection to ssh 30529 1726882667.55641: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882667.55673: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.55676: variable 'ansible_connection' from source: unknown 30529 1726882667.55678: variable 'ansible_module_compression' from source: unknown 30529 1726882667.55681: variable 'ansible_shell_type' from source: unknown 30529 1726882667.55683: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.55685: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.55772: variable 'ansible_pipelining' from source: unknown 30529 1726882667.55775: variable 'ansible_timeout' from source: unknown 30529 1726882667.55777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.55855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882667.55866: variable 'omit' from source: magic vars 30529 1726882667.55871: starting attempt loop 30529 1726882667.55874: running the handler 30529 1726882667.55938: variable '__network_connections_result' from source: set_fact 30529 1726882667.56026: variable '__network_connections_result' from source: set_fact 30529 1726882667.56127: handler run complete 30529 1726882667.56149: attempt loop complete, returning result 30529 1726882667.56152: _execute() done 30529 1726882667.56155: dumping result to json 30529 1726882667.56157: done dumping result, returning 30529 1726882667.56198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000019d2] 30529 1726882667.56202: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d2 30529 1726882667.56275: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d2 30529 1726882667.56279: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 20e24cfc-e38f-4d09-8124-2176ed3997b7 skipped because already active" ] } } 30529 1726882667.56372: no more pending results, returning what we have 30529 1726882667.56376: results queue empty 30529 
1726882667.56377: checking for any_errors_fatal 30529 1726882667.56385: done checking for any_errors_fatal 30529 1726882667.56386: checking for max_fail_percentage 30529 1726882667.56391: done checking for max_fail_percentage 30529 1726882667.56392: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.56497: done checking to see if all hosts have failed 30529 1726882667.56499: getting the remaining hosts for this loop 30529 1726882667.56501: done getting the remaining hosts for this loop 30529 1726882667.56506: getting the next task for host managed_node1 30529 1726882667.56515: done getting next task for host managed_node1 30529 1726882667.56519: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882667.56524: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882667.56540: getting variables 30529 1726882667.56542: in VariableManager get_vars() 30529 1726882667.56582: Calling all_inventory to load vars for managed_node1 30529 1726882667.56584: Calling groups_inventory to load vars for managed_node1 30529 1726882667.56711: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.56721: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.56724: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.56727: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.58213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.59853: done with get_vars() 30529 1726882667.59879: done getting variables 30529 1726882667.59942: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:47 -0400 (0:00:00.064) 0:01:21.625 ****** 30529 1726882667.59984: entering _queue_task() for managed_node1/debug 30529 1726882667.60378: worker is 1 (out of 1 available) 30529 1726882667.60603: exiting _queue_task() for managed_node1/debug 30529 1726882667.60614: done queuing things up, now waiting for results queue to drain 30529 1726882667.60616: waiting for pending results... 
30529 1726882667.60809: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882667.60885: in run() - task 12673a56-9f93-b0f1-edc0-0000000019d3 30529 1726882667.60901: variable 'ansible_search_path' from source: unknown 30529 1726882667.60905: variable 'ansible_search_path' from source: unknown 30529 1726882667.60933: calling self._execute() 30529 1726882667.61023: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.61027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.61035: variable 'omit' from source: magic vars 30529 1726882667.61333: variable 'ansible_distribution_major_version' from source: facts 30529 1726882667.61342: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882667.61431: variable 'network_state' from source: role '' defaults 30529 1726882667.61439: Evaluated conditional (network_state != {}): False 30529 1726882667.61442: when evaluation is False, skipping this task 30529 1726882667.61444: _execute() done 30529 1726882667.61447: dumping result to json 30529 1726882667.61449: done dumping result, returning 30529 1726882667.61457: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-0000000019d3] 30529 1726882667.61461: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d3 30529 1726882667.61549: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d3 30529 1726882667.61551: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882667.61597: no more pending results, returning what we have 30529 1726882667.61601: results queue empty 30529 1726882667.61602: checking for any_errors_fatal 30529 1726882667.61611: done checking for any_errors_fatal 30529 1726882667.61612: checking for 
max_fail_percentage 30529 1726882667.61614: done checking for max_fail_percentage 30529 1726882667.61615: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.61616: done checking to see if all hosts have failed 30529 1726882667.61617: getting the remaining hosts for this loop 30529 1726882667.61618: done getting the remaining hosts for this loop 30529 1726882667.61622: getting the next task for host managed_node1 30529 1726882667.61631: done getting next task for host managed_node1 30529 1726882667.61634: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882667.61640: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882667.61668: getting variables 30529 1726882667.61670: in VariableManager get_vars() 30529 1726882667.61711: Calling all_inventory to load vars for managed_node1 30529 1726882667.61713: Calling groups_inventory to load vars for managed_node1 30529 1726882667.61715: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.61725: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.61728: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.61730: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.62707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.63929: done with get_vars() 30529 1726882667.63944: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:47 -0400 (0:00:00.040) 0:01:21.666 ****** 30529 1726882667.64016: entering _queue_task() for managed_node1/ping 30529 1726882667.64253: worker is 1 (out of 1 available) 30529 1726882667.64266: exiting _queue_task() for managed_node1/ping 30529 1726882667.64278: done queuing things up, now waiting for results queue to drain 30529 1726882667.64281: waiting for pending results... 
30529 1726882667.64466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882667.64575: in run() - task 12673a56-9f93-b0f1-edc0-0000000019d4 30529 1726882667.64587: variable 'ansible_search_path' from source: unknown 30529 1726882667.64590: variable 'ansible_search_path' from source: unknown 30529 1726882667.64623: calling self._execute() 30529 1726882667.64698: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.64702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.64711: variable 'omit' from source: magic vars 30529 1726882667.65001: variable 'ansible_distribution_major_version' from source: facts 30529 1726882667.65012: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882667.65018: variable 'omit' from source: magic vars 30529 1726882667.65064: variable 'omit' from source: magic vars 30529 1726882667.65087: variable 'omit' from source: magic vars 30529 1726882667.65121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882667.65148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882667.65167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882667.65180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.65194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882667.65218: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882667.65221: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.65224: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882667.65297: Set connection var ansible_shell_executable to /bin/sh 30529 1726882667.65301: Set connection var ansible_pipelining to False 30529 1726882667.65303: Set connection var ansible_shell_type to sh 30529 1726882667.65311: Set connection var ansible_timeout to 10 30529 1726882667.65314: Set connection var ansible_connection to ssh 30529 1726882667.65318: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882667.65335: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.65337: variable 'ansible_connection' from source: unknown 30529 1726882667.65340: variable 'ansible_module_compression' from source: unknown 30529 1726882667.65343: variable 'ansible_shell_type' from source: unknown 30529 1726882667.65345: variable 'ansible_shell_executable' from source: unknown 30529 1726882667.65347: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882667.65351: variable 'ansible_pipelining' from source: unknown 30529 1726882667.65353: variable 'ansible_timeout' from source: unknown 30529 1726882667.65357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882667.65566: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882667.65571: variable 'omit' from source: magic vars 30529 1726882667.65573: starting attempt loop 30529 1726882667.65576: running the handler 30529 1726882667.65578: _low_level_execute_command(): starting 30529 1726882667.65580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882667.66299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882667.66303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 
1726882667.66305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.66308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882667.66319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882667.66399: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882667.66402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.66405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882667.66408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882667.66410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882667.66412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.66415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.66418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882667.66420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882667.66422: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882667.66424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.66547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882667.66550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.66553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.66602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30529 1726882667.68232: stdout chunk (state=3): >>>/root <<< 30529 1726882667.68386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.68395: stdout chunk (state=3): >>><<< 30529 1726882667.68398: stderr chunk (state=3): >>><<< 30529 1726882667.68506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.68510: _low_level_execute_command(): starting 30529 1726882667.68513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827 `" && echo ansible-tmp-1726882667.6842134-34441-243403678145827="` echo /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827 `" ) && sleep 0' 30529 
1726882667.69098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882667.69113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.69128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.69170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882667.69185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882667.69275: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.69309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.69382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.71234: stdout chunk (state=3): >>>ansible-tmp-1726882667.6842134-34441-243403678145827=/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827 <<< 30529 1726882667.71415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.71419: stdout chunk (state=3): >>><<< 30529 1726882667.71422: stderr chunk (state=3): >>><<< 30529 1726882667.71441: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726882667.6842134-34441-243403678145827=/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.71501: variable 'ansible_module_compression' from source: unknown 30529 1726882667.71598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882667.71601: variable 'ansible_facts' from source: unknown 30529 1726882667.71697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py 30529 1726882667.71940: Sending initial data 30529 1726882667.71957: Sent initial data (153 bytes) 30529 1726882667.72522: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.72611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.72649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882667.72671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.72682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.72757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.74312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882667.74382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882667.74425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvic5zafu /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py <<< 30529 1726882667.74428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py" <<< 30529 1726882667.74505: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvic5zafu" to remote "/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py" <<< 30529 1726882667.75290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.75304: stderr chunk (state=3): >>><<< 30529 1726882667.75311: stdout chunk (state=3): >>><<< 30529 1726882667.75347: done transferring module to remote 30529 1726882667.75356: _low_level_execute_command(): starting 30529 1726882667.75361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/ /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py && sleep 0' 30529 1726882667.75775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.75805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882667.75812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.75815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882667.75817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.75861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882667.75873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.75876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.75912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.77714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.77718: stdout chunk (state=3): >>><<< 30529 1726882667.77720: stderr chunk (state=3): >>><<< 30529 1726882667.77823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.77826: _low_level_execute_command(): starting 30529 1726882667.77832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/AnsiballZ_ping.py && sleep 0' 30529 1726882667.78264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882667.78281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882667.78309: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.78358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882667.78365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.78367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.78416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.93234: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882667.94490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882667.94497: stdout chunk (state=3): >>><<< 30529 1726882667.94500: stderr chunk (state=3): >>><<< 30529 1726882667.94521: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882667.94541: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882667.94549: _low_level_execute_command(): starting 30529 1726882667.94554: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882667.6842134-34441-243403678145827/ > /dev/null 2>&1 && sleep 0' 30529 1726882667.95002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882667.95005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.95008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882667.95010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882667.95012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882667.95056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882667.95059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882667.95063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882667.95111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882667.96887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882667.96914: stderr chunk (state=3): >>><<< 30529 1726882667.96917: stdout chunk (state=3): >>><<< 30529 1726882667.96935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882667.96941: handler run complete 30529 1726882667.96953: attempt loop complete, returning result 30529 1726882667.96956: _execute() done 30529 1726882667.96958: dumping result to json 30529 1726882667.96960: done dumping result, returning 30529 1726882667.96969: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-0000000019d4] 30529 1726882667.96973: sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d4 30529 1726882667.97061: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000019d4 30529 1726882667.97064: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882667.97131: no more pending results, returning what we have 30529 1726882667.97134: results queue empty 30529 1726882667.97135: checking for any_errors_fatal 30529 1726882667.97143: done checking for any_errors_fatal 30529 1726882667.97143: checking for max_fail_percentage 30529 1726882667.97145: done checking for max_fail_percentage 30529 1726882667.97146: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.97147: done checking to see if all hosts have failed 30529 1726882667.97148: getting the remaining hosts for this loop 30529 1726882667.97149: done getting the remaining hosts for this loop 30529 1726882667.97153: getting the next task for host managed_node1 30529 1726882667.97164: done getting next task for host managed_node1 30529 1726882667.97166: ^ task is: TASK: meta (role_complete) 30529 1726882667.97171: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882667.97183: getting variables 30529 1726882667.97185: in VariableManager get_vars() 30529 1726882667.97233: Calling all_inventory to load vars for managed_node1 30529 1726882667.97236: Calling groups_inventory to load vars for managed_node1 30529 1726882667.97238: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.97247: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.97250: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.97252: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.98096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882667.99116: done with get_vars() 30529 1726882667.99133: done getting variables 30529 1726882667.99199: done queuing things up, now waiting for results queue to drain 30529 1726882667.99201: results queue empty 30529 1726882667.99201: checking for any_errors_fatal 30529 1726882667.99203: done checking for any_errors_fatal 30529 1726882667.99204: checking for max_fail_percentage 30529 1726882667.99204: done checking for max_fail_percentage 30529 1726882667.99205: checking to see if all hosts have failed and the running result is not ok 30529 1726882667.99205: done checking to see if all hosts have failed 30529 1726882667.99206: getting the remaining hosts for this loop 30529 1726882667.99206: done getting the remaining hosts for this loop 30529 1726882667.99208: getting the next task for host managed_node1 30529 1726882667.99213: done getting next task for host managed_node1 30529 1726882667.99215: ^ task is: TASK: Include network role 30529 1726882667.99217: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882667.99219: getting variables 30529 1726882667.99220: in VariableManager get_vars() 30529 1726882667.99228: Calling all_inventory to load vars for managed_node1 30529 1726882667.99229: Calling groups_inventory to load vars for managed_node1 30529 1726882667.99231: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882667.99234: Calling all_plugins_play to load vars for managed_node1 30529 1726882667.99236: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882667.99237: Calling groups_plugins_play to load vars for managed_node1 30529 1726882667.99878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.00743: done with get_vars() 30529 1726882668.00757: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:37:48 -0400 (0:00:00.368) 0:01:22.034 ****** 30529 1726882668.00822: entering _queue_task() for managed_node1/include_role 30529 1726882668.01101: worker is 1 (out of 1 available) 30529 1726882668.01113: exiting _queue_task() for managed_node1/include_role 30529 
1726882668.01125: done queuing things up, now waiting for results queue to drain 30529 1726882668.01127: waiting for pending results... 30529 1726882668.01320: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882668.01416: in run() - task 12673a56-9f93-b0f1-edc0-0000000017d9 30529 1726882668.01427: variable 'ansible_search_path' from source: unknown 30529 1726882668.01431: variable 'ansible_search_path' from source: unknown 30529 1726882668.01465: calling self._execute() 30529 1726882668.01543: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.01547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.01556: variable 'omit' from source: magic vars 30529 1726882668.01851: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.01860: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.01867: _execute() done 30529 1726882668.01870: dumping result to json 30529 1726882668.01873: done dumping result, returning 30529 1726882668.01880: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-0000000017d9] 30529 1726882668.01885: sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d9 30529 1726882668.01992: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000017d9 30529 1726882668.01997: WORKER PROCESS EXITING 30529 1726882668.02029: no more pending results, returning what we have 30529 1726882668.02033: in VariableManager get_vars() 30529 1726882668.02076: Calling all_inventory to load vars for managed_node1 30529 1726882668.02078: Calling groups_inventory to load vars for managed_node1 30529 1726882668.02081: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.02097: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.02101: Calling groups_plugins_inventory to load vars for managed_node1 30529 
1726882668.02104: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.03080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.03937: done with get_vars() 30529 1726882668.03952: variable 'ansible_search_path' from source: unknown 30529 1726882668.03953: variable 'ansible_search_path' from source: unknown 30529 1726882668.04049: variable 'omit' from source: magic vars 30529 1726882668.04075: variable 'omit' from source: magic vars 30529 1726882668.04084: variable 'omit' from source: magic vars 30529 1726882668.04086: we have included files to process 30529 1726882668.04089: generating all_blocks data 30529 1726882668.04091: done generating all_blocks data 30529 1726882668.04095: processing included file: fedora.linux_system_roles.network 30529 1726882668.04108: in VariableManager get_vars() 30529 1726882668.04118: done with get_vars() 30529 1726882668.04137: in VariableManager get_vars() 30529 1726882668.04150: done with get_vars() 30529 1726882668.04178: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882668.04255: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882668.04306: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882668.04596: in VariableManager get_vars() 30529 1726882668.04611: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882668.05876: iterating over new_blocks loaded from include file 30529 1726882668.05878: in VariableManager get_vars() 30529 1726882668.05892: done with get_vars() 30529 1726882668.05895: filtering new block on tags 30529 1726882668.06122: done filtering new block on tags 30529 1726882668.06126: in VariableManager get_vars() 30529 1726882668.06140: done with 
get_vars() 30529 1726882668.06142: filtering new block on tags 30529 1726882668.06158: done filtering new block on tags 30529 1726882668.06160: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882668.06165: extending task lists for all hosts with included blocks 30529 1726882668.06269: done extending task lists 30529 1726882668.06270: done processing included files 30529 1726882668.06271: results queue empty 30529 1726882668.06271: checking for any_errors_fatal 30529 1726882668.06273: done checking for any_errors_fatal 30529 1726882668.06274: checking for max_fail_percentage 30529 1726882668.06275: done checking for max_fail_percentage 30529 1726882668.06275: checking to see if all hosts have failed and the running result is not ok 30529 1726882668.06276: done checking to see if all hosts have failed 30529 1726882668.06277: getting the remaining hosts for this loop 30529 1726882668.06279: done getting the remaining hosts for this loop 30529 1726882668.06283: getting the next task for host managed_node1 30529 1726882668.06287: done getting next task for host managed_node1 30529 1726882668.06290: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882668.06294: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882668.06305: getting variables 30529 1726882668.06306: in VariableManager get_vars() 30529 1726882668.06320: Calling all_inventory to load vars for managed_node1 30529 1726882668.06323: Calling groups_inventory to load vars for managed_node1 30529 1726882668.06324: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.06329: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.06331: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.06334: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.07375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.08306: done with get_vars() 30529 1726882668.08324: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:48 -0400 (0:00:00.075) 0:01:22.109 ****** 30529 1726882668.08376: entering _queue_task() for managed_node1/include_tasks 30529 1726882668.08652: worker is 1 (out of 1 available) 30529 1726882668.08666: exiting _queue_task() for managed_node1/include_tasks 30529 1726882668.08679: done queuing things up, now waiting for results queue to drain 30529 1726882668.08681: 
waiting for pending results... 30529 1726882668.08922: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882668.09041: in run() - task 12673a56-9f93-b0f1-edc0-000000001b3b 30529 1726882668.09058: variable 'ansible_search_path' from source: unknown 30529 1726882668.09062: variable 'ansible_search_path' from source: unknown 30529 1726882668.09097: calling self._execute() 30529 1726882668.09398: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.09402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.09405: variable 'omit' from source: magic vars 30529 1726882668.09666: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.09683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.09695: _execute() done 30529 1726882668.09705: dumping result to json 30529 1726882668.09712: done dumping result, returning 30529 1726882668.09722: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000001b3b] 30529 1726882668.09731: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3b 30529 1726882668.09902: no more pending results, returning what we have 30529 1726882668.09908: in VariableManager get_vars() 30529 1726882668.10008: Calling all_inventory to load vars for managed_node1 30529 1726882668.10011: Calling groups_inventory to load vars for managed_node1 30529 1726882668.10014: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.10028: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.10032: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.10035: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.10736: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000001b3b 30529 1726882668.10739: WORKER PROCESS EXITING 30529 1726882668.11534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.12489: done with get_vars() 30529 1726882668.12509: variable 'ansible_search_path' from source: unknown 30529 1726882668.12511: variable 'ansible_search_path' from source: unknown 30529 1726882668.12549: we have included files to process 30529 1726882668.12550: generating all_blocks data 30529 1726882668.12552: done generating all_blocks data 30529 1726882668.12555: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882668.12557: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882668.12559: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882668.13132: done processing included file 30529 1726882668.13134: iterating over new_blocks loaded from include file 30529 1726882668.13136: in VariableManager get_vars() 30529 1726882668.13159: done with get_vars() 30529 1726882668.13161: filtering new block on tags 30529 1726882668.13194: done filtering new block on tags 30529 1726882668.13198: in VariableManager get_vars() 30529 1726882668.13222: done with get_vars() 30529 1726882668.13224: filtering new block on tags 30529 1726882668.13267: done filtering new block on tags 30529 1726882668.13270: in VariableManager get_vars() 30529 1726882668.13290: done with get_vars() 30529 1726882668.13292: filtering new block on tags 30529 1726882668.13332: done filtering new block on tags 30529 1726882668.13334: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 
1726882668.13339: extending task lists for all hosts with included blocks 30529 1726882668.14881: done extending task lists 30529 1726882668.14883: done processing included files 30529 1726882668.14884: results queue empty 30529 1726882668.14884: checking for any_errors_fatal 30529 1726882668.14888: done checking for any_errors_fatal 30529 1726882668.14889: checking for max_fail_percentage 30529 1726882668.14890: done checking for max_fail_percentage 30529 1726882668.14891: checking to see if all hosts have failed and the running result is not ok 30529 1726882668.14892: done checking to see if all hosts have failed 30529 1726882668.14894: getting the remaining hosts for this loop 30529 1726882668.14895: done getting the remaining hosts for this loop 30529 1726882668.14898: getting the next task for host managed_node1 30529 1726882668.14903: done getting next task for host managed_node1 30529 1726882668.14906: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882668.14910: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882668.14921: getting variables 30529 1726882668.14922: in VariableManager get_vars() 30529 1726882668.14934: Calling all_inventory to load vars for managed_node1 30529 1726882668.14937: Calling groups_inventory to load vars for managed_node1 30529 1726882668.14939: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.14943: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.14946: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.14948: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.16103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.17623: done with get_vars() 30529 1726882668.17645: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:48 -0400 (0:00:00.093) 0:01:22.203 ****** 30529 1726882668.17722: entering _queue_task() for managed_node1/setup 30529 1726882668.18100: worker is 1 (out of 1 available) 30529 1726882668.18114: exiting _queue_task() for managed_node1/setup 30529 1726882668.18128: done queuing things up, now waiting for results queue to drain 30529 1726882668.18129: waiting for pending results... 
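Every task in this stretch of the log first evaluates its `when:` guard, `ansible_distribution_major_version != '6'`, against gathered facts before deciding whether to run. A minimal sketch of that check (the fact value is illustrative, and real Ansible evaluates the expression through Jinja rather than this helper):

```python
facts = {"ansible_distribution_major_version": "40"}  # illustrative gathered value

def when_not_equal(facts, var, literal):
    # Simplified stand-in for Jinja evaluating `<var> != '<literal>'`.
    return facts.get(var) != literal

result = when_not_equal(facts, "ansible_distribution_major_version", "6")
print(result)  # True -> "Evaluated conditional ... : True" and the task proceeds
```

When the guard evaluates False, the executor skips the task and reports `skip_reason`, as seen for later tasks in this log.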
30529 1726882668.18433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882668.18619: in run() - task 12673a56-9f93-b0f1-edc0-000000001b92 30529 1726882668.18642: variable 'ansible_search_path' from source: unknown 30529 1726882668.18650: variable 'ansible_search_path' from source: unknown 30529 1726882668.18692: calling self._execute() 30529 1726882668.18800: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.18813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.18830: variable 'omit' from source: magic vars 30529 1726882668.19298: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.19302: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.19438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882668.21543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882668.21619: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882668.21946: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882668.21950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882668.21952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882668.22300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882668.22304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882668.22307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882668.22342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882668.22363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882668.22426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882668.22455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882668.22482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882668.22555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882668.22643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882668.22977: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882668.22994: variable 'ansible_facts' from source: unknown 30529 1726882668.24612: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882668.24623: when evaluation is False, skipping this task 30529 1726882668.24632: _execute() done 30529 1726882668.24639: dumping result to json 30529 1726882668.24646: done dumping result, returning 30529 1726882668.24658: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000001b92] 30529 1726882668.24668: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b92 30529 1726882668.24787: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b92 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882668.24838: no more pending results, returning what we have 30529 1726882668.24843: results queue empty 30529 1726882668.24844: checking for any_errors_fatal 30529 1726882668.24845: done checking for any_errors_fatal 30529 1726882668.24846: checking for max_fail_percentage 30529 1726882668.24848: done checking for max_fail_percentage 30529 1726882668.24849: checking to see if all hosts have failed and the running result is not ok 30529 1726882668.24850: done checking to see if all hosts have failed 30529 1726882668.24850: getting the remaining hosts for this loop 30529 1726882668.24853: done getting the remaining hosts for this loop 30529 1726882668.24856: getting the next task for host managed_node1 30529 1726882668.24868: done getting next task for host managed_node1 30529 1726882668.24872: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882668.24878: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882668.24906: getting variables 30529 1726882668.24908: in VariableManager get_vars() 30529 1726882668.24953: Calling all_inventory to load vars for managed_node1 30529 1726882668.24956: Calling groups_inventory to load vars for managed_node1 30529 1726882668.24959: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.24971: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.24974: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.24978: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.25782: WORKER PROCESS EXITING 30529 1726882668.26999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.28555: done with get_vars() 30529 1726882668.28581: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:48 -0400 (0:00:00.109) 0:01:22.312 ****** 30529 1726882668.28682: entering _queue_task() for managed_node1/stat 30529 1726882668.29045: worker is 1 (out of 1 available) 30529 1726882668.29058: exiting _queue_task() for managed_node1/stat 30529 1726882668.29073: done queuing things up, now waiting for results queue to drain 30529 1726882668.29074: waiting for pending results... 
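The skip at "Ensure ansible_facts used by role are present" above comes from the difference-filter guard logged as `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. A minimal Python sketch of that logic (the fact names below are illustrative, not the role's actual `__network_required_facts` list):

```python
# Sketch of the guard: run setup only if some required fact is not yet gathered.
required_facts = ["distribution", "distribution_major_version", "os_family"]  # illustrative
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}
# Jinja's difference filter keeps items in the first list absent from the second.
missing = [f for f in required_facts if f not in ansible_facts]
run_setup = len(missing) > 0
print(run_setup)  # False -> the setup task is skipped, matching the log
```

Because all required facts were already present, the conditional evaluated False and the task was skipped rather than re-running fact gathering.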
30529 1726882668.29366: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882668.29549: in run() - task 12673a56-9f93-b0f1-edc0-000000001b94 30529 1726882668.29569: variable 'ansible_search_path' from source: unknown 30529 1726882668.29576: variable 'ansible_search_path' from source: unknown 30529 1726882668.29617: calling self._execute() 30529 1726882668.29723: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.29738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.29752: variable 'omit' from source: magic vars 30529 1726882668.30120: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.30298: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.30302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882668.30579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882668.30635: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882668.30672: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882668.30713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882668.30804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882668.30836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882668.30872: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882668.30906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882668.31001: variable '__network_is_ostree' from source: set_fact 30529 1726882668.31015: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882668.31023: when evaluation is False, skipping this task 30529 1726882668.31031: _execute() done 30529 1726882668.31039: dumping result to json 30529 1726882668.31046: done dumping result, returning 30529 1726882668.31058: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000001b94] 30529 1726882668.31074: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b94 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882668.31224: no more pending results, returning what we have 30529 1726882668.31228: results queue empty 30529 1726882668.31229: checking for any_errors_fatal 30529 1726882668.31238: done checking for any_errors_fatal 30529 1726882668.31239: checking for max_fail_percentage 30529 1726882668.31241: done checking for max_fail_percentage 30529 1726882668.31242: checking to see if all hosts have failed and the running result is not ok 30529 1726882668.31243: done checking to see if all hosts have failed 30529 1726882668.31243: getting the remaining hosts for this loop 30529 1726882668.31246: done getting the remaining hosts for this loop 30529 1726882668.31250: getting the next task for host managed_node1 30529 1726882668.31260: done getting next task for host managed_node1 30529 
1726882668.31265: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882668.31270: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882668.31301: getting variables 30529 1726882668.31304: in VariableManager get_vars() 30529 1726882668.31349: Calling all_inventory to load vars for managed_node1 30529 1726882668.31352: Calling groups_inventory to load vars for managed_node1 30529 1726882668.31354: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.31365: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.31369: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.31372: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.32248: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b94 30529 1726882668.32251: WORKER PROCESS EXITING 30529 1726882668.34142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.37327: done with get_vars() 30529 1726882668.37359: done getting variables 30529 1726882668.37423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:48 -0400 (0:00:00.087) 0:01:22.400 ****** 30529 1726882668.37461: entering _queue_task() for managed_node1/set_fact 30529 1726882668.38050: worker is 1 (out of 1 available) 30529 1726882668.38063: exiting _queue_task() for managed_node1/set_fact 30529 1726882668.38079: done queuing things up, now waiting for results queue to drain 30529 1726882668.38081: waiting for pending results... 
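Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are guarded by `not __network_is_ostree is defined`. In Jinja, `is defined` asks only whether the variable exists at all; once an earlier `set_fact` has stored `__network_is_ostree`, both tasks skip. A hedged sketch of that membership test (the stored value is illustrative):

```python
# Sketch of the `not __network_is_ostree is defined` guard.
host_vars = {"__network_is_ostree": False}  # value from a prior set_fact (illustrative)
needs_detection = "__network_is_ostree" not in host_vars
print(needs_detection)  # False -> both ostree tasks skip, as in the log
```

This is a common caching idiom in roles: detect once per host, then let the `is defined` guard short-circuit subsequent include passes.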
30529 1726882668.38381: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882668.38552: in run() - task 12673a56-9f93-b0f1-edc0-000000001b95 30529 1726882668.38575: variable 'ansible_search_path' from source: unknown 30529 1726882668.38584: variable 'ansible_search_path' from source: unknown 30529 1726882668.38634: calling self._execute() 30529 1726882668.38741: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.38750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.38762: variable 'omit' from source: magic vars 30529 1726882668.39300: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.39304: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.39307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882668.39584: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882668.39645: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882668.39689: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882668.39736: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882668.39817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882668.39851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882668.39888: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882668.39923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882668.40021: variable '__network_is_ostree' from source: set_fact 30529 1726882668.40033: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882668.40040: when evaluation is False, skipping this task 30529 1726882668.40046: _execute() done 30529 1726882668.40054: dumping result to json 30529 1726882668.40063: done dumping result, returning 30529 1726882668.40076: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000001b95] 30529 1726882668.40086: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b95 30529 1726882668.40321: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b95 30529 1726882668.40324: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882668.40371: no more pending results, returning what we have 30529 1726882668.40375: results queue empty 30529 1726882668.40376: checking for any_errors_fatal 30529 1726882668.40382: done checking for any_errors_fatal 30529 1726882668.40383: checking for max_fail_percentage 30529 1726882668.40385: done checking for max_fail_percentage 30529 1726882668.40386: checking to see if all hosts have failed and the running result is not ok 30529 1726882668.40387: done checking to see if all hosts have failed 30529 1726882668.40387: getting the remaining hosts for this loop 30529 1726882668.40389: done getting the remaining hosts for this loop 
30529 1726882668.40395: getting the next task for host managed_node1 30529 1726882668.40406: done getting next task for host managed_node1 30529 1726882668.40411: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882668.40418: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882668.40447: getting variables 30529 1726882668.40449: in VariableManager get_vars() 30529 1726882668.40495: Calling all_inventory to load vars for managed_node1 30529 1726882668.40498: Calling groups_inventory to load vars for managed_node1 30529 1726882668.40501: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882668.40512: Calling all_plugins_play to load vars for managed_node1 30529 1726882668.40516: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882668.40518: Calling groups_plugins_play to load vars for managed_node1 30529 1726882668.42660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882668.44373: done with get_vars() 30529 1726882668.44398: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:48 -0400 (0:00:00.070) 0:01:22.471 ****** 30529 1726882668.44508: entering _queue_task() for managed_node1/service_facts 30529 1726882668.45026: worker is 1 (out of 1 available) 30529 1726882668.45040: exiting _queue_task() for managed_node1/service_facts 30529 1726882668.45054: done queuing things up, now waiting for results queue to drain 30529 1726882668.45056: waiting for pending results... 
30529 1726882668.45714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882668.45847: in run() - task 12673a56-9f93-b0f1-edc0-000000001b97 30529 1726882668.45878: variable 'ansible_search_path' from source: unknown 30529 1726882668.45899: variable 'ansible_search_path' from source: unknown 30529 1726882668.45931: calling self._execute() 30529 1726882668.46055: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.46058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.46164: variable 'omit' from source: magic vars 30529 1726882668.46464: variable 'ansible_distribution_major_version' from source: facts 30529 1726882668.46481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882668.46502: variable 'omit' from source: magic vars 30529 1726882668.46582: variable 'omit' from source: magic vars 30529 1726882668.46627: variable 'omit' from source: magic vars 30529 1726882668.46669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882668.46715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882668.46742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882668.46764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882668.46783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882668.46825: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882668.46834: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.46842: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882668.46954: Set connection var ansible_shell_executable to /bin/sh 30529 1726882668.46965: Set connection var ansible_pipelining to False 30529 1726882668.47034: Set connection var ansible_shell_type to sh 30529 1726882668.47037: Set connection var ansible_timeout to 10 30529 1726882668.47039: Set connection var ansible_connection to ssh 30529 1726882668.47041: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882668.47043: variable 'ansible_shell_executable' from source: unknown 30529 1726882668.47046: variable 'ansible_connection' from source: unknown 30529 1726882668.47048: variable 'ansible_module_compression' from source: unknown 30529 1726882668.47050: variable 'ansible_shell_type' from source: unknown 30529 1726882668.47052: variable 'ansible_shell_executable' from source: unknown 30529 1726882668.47054: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882668.47061: variable 'ansible_pipelining' from source: unknown 30529 1726882668.47068: variable 'ansible_timeout' from source: unknown 30529 1726882668.47076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882668.47280: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882668.47299: variable 'omit' from source: magic vars 30529 1726882668.47310: starting attempt loop 30529 1726882668.47317: running the handler 30529 1726882668.47360: _low_level_execute_command(): starting 30529 1726882668.47363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882668.48062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882668.48078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30529 1726882668.48107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882668.48130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.48212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.48237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882668.48254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882668.48275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882668.48375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882668.50008: stdout chunk (state=3): >>>/root <<< 30529 1726882668.50152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882668.50182: stdout chunk (state=3): >>><<< 30529 1726882668.50185: stderr chunk (state=3): >>><<< 30529 1726882668.50208: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882668.50304: _low_level_execute_command(): starting 30529 1726882668.50308: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991 `" && echo ansible-tmp-1726882668.502153-34478-105002358294991="` echo /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991 `" ) && sleep 0' 30529 1726882668.50999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882668.51021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882668.51035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882668.51114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882668.52968: stdout chunk (state=3): >>>ansible-tmp-1726882668.502153-34478-105002358294991=/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991 <<< 30529 1726882668.53112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882668.53122: stdout chunk (state=3): >>><<< 30529 1726882668.53135: stderr chunk (state=3): >>><<< 30529 1726882668.53158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882668.502153-34478-105002358294991=/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882668.53208: variable 'ansible_module_compression' from source: unknown 30529 1726882668.53260: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882668.53301: variable 'ansible_facts' from source: unknown 30529 1726882668.53399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py 30529 1726882668.53619: Sending initial data 30529 1726882668.53622: Sent initial data (161 bytes) 30529 1726882668.54175: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882668.54286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882668.54310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882668.54325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882668.54402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882668.55896: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882668.55940: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882668.55966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882668.56026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpyoy_5n_v /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py <<< 30529 1726882668.56034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py" <<< 30529 1726882668.56075: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpyoy_5n_v" to remote "/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py" <<< 30529 1726882668.56077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py" <<< 30529 1726882668.56626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882668.56657: stderr chunk (state=3): >>><<< 30529 1726882668.56660: stdout chunk (state=3): >>><<< 30529 1726882668.56675: done transferring module to remote 30529 1726882668.56683: _low_level_execute_command(): starting 30529 1726882668.56686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/ /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py && sleep 0' 30529 1726882668.57074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882668.57109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882668.57112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 
1726882668.57114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.57117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882668.57120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.57167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882668.57174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882668.57216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882668.58919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882668.58940: stderr chunk (state=3): >>><<< 30529 1726882668.58943: stdout chunk (state=3): >>><<< 30529 1726882668.58951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882668.59006: _low_level_execute_command(): starting 30529 1726882668.59010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/AnsiballZ_service_facts.py && sleep 0' 30529 1726882668.59347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882668.59351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.59353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882668.59356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882668.59366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882668.59398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882668.59416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882668.59459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.09970: stdout chunk (state=3): >>> <<< 30529 1726882670.10068: stdout chunk (state=3): >>>{"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stop<<< 30529 1726882670.10082: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882670.11587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882670.11590: stdout chunk (state=3): >>><<< 30529 1726882670.11595: stderr chunk (state=3): >>><<< 30529 1726882670.11802: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": 
"sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": 
{"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": 
"lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882670.12427: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882670.12445: _low_level_execute_command(): starting 30529 1726882670.12455: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882668.502153-34478-105002358294991/ > /dev/null 2>&1 && sleep 0' 30529 1726882670.13114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882670.13138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882670.13212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.13266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882670.13286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882670.13311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.13379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.15218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.15221: stdout chunk (state=3): >>><<< 30529 1726882670.15223: stderr chunk (state=3): >>><<< 30529 1726882670.15464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882670.15468: handler run complete 30529 1726882670.15759: variable 'ansible_facts' from source: unknown 30529 1726882670.16061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882670.16995: variable 'ansible_facts' from source: unknown 30529 1726882670.17141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882670.17365: attempt loop complete, returning result 30529 1726882670.17377: _execute() done 30529 1726882670.17388: dumping result to json 30529 1726882670.17460: done dumping result, returning 30529 1726882670.17474: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000001b97] 30529 1726882670.17484: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b97 30529 1726882670.18612: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b97 30529 1726882670.18616: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882670.18898: no more pending results, returning what we have 30529 1726882670.18902: results queue empty 30529 1726882670.18904: checking for any_errors_fatal 30529 1726882670.18917: done checking for any_errors_fatal 30529 1726882670.18918: checking for max_fail_percentage 30529 1726882670.18920: done checking for max_fail_percentage 30529 1726882670.18921: checking to see if all hosts have failed and the running result is not ok 30529 1726882670.18922: done checking to see if all hosts have failed 30529 1726882670.18923: getting the 
remaining hosts for this loop 30529 1726882670.18925: done getting the remaining hosts for this loop 30529 1726882670.18928: getting the next task for host managed_node1 30529 1726882670.18936: done getting next task for host managed_node1 30529 1726882670.18940: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882670.18946: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882670.18959: getting variables 30529 1726882670.18960: in VariableManager get_vars() 30529 1726882670.19469: Calling all_inventory to load vars for managed_node1 30529 1726882670.19473: Calling groups_inventory to load vars for managed_node1 30529 1726882670.19476: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882670.19485: Calling all_plugins_play to load vars for managed_node1 30529 1726882670.19491: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882670.19496: Calling groups_plugins_play to load vars for managed_node1 30529 1726882670.20890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882670.28356: done with get_vars() 30529 1726882670.28381: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:50 -0400 (0:00:01.839) 0:01:24.310 ****** 30529 1726882670.28472: entering _queue_task() for managed_node1/package_facts 30529 1726882670.28849: worker is 1 (out of 1 available) 30529 1726882670.28866: exiting _queue_task() for managed_node1/package_facts 30529 1726882670.28882: done queuing things up, now waiting for results queue to drain 30529 1726882670.28884: waiting for pending results... 
30529 1726882670.29082: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882670.29198: in run() - task 12673a56-9f93-b0f1-edc0-000000001b98 30529 1726882670.29208: variable 'ansible_search_path' from source: unknown 30529 1726882670.29216: variable 'ansible_search_path' from source: unknown 30529 1726882670.29241: calling self._execute() 30529 1726882670.29319: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882670.29327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882670.29333: variable 'omit' from source: magic vars 30529 1726882670.29622: variable 'ansible_distribution_major_version' from source: facts 30529 1726882670.29631: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882670.29637: variable 'omit' from source: magic vars 30529 1726882670.29698: variable 'omit' from source: magic vars 30529 1726882670.29724: variable 'omit' from source: magic vars 30529 1726882670.29754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882670.29783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882670.29802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882670.29816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882670.29827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882670.29852: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882670.29856: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882670.29858: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882670.29932: Set connection var ansible_shell_executable to /bin/sh 30529 1726882670.29936: Set connection var ansible_pipelining to False 30529 1726882670.29939: Set connection var ansible_shell_type to sh 30529 1726882670.29946: Set connection var ansible_timeout to 10 30529 1726882670.29949: Set connection var ansible_connection to ssh 30529 1726882670.29954: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882670.29972: variable 'ansible_shell_executable' from source: unknown 30529 1726882670.29974: variable 'ansible_connection' from source: unknown 30529 1726882670.29978: variable 'ansible_module_compression' from source: unknown 30529 1726882670.29982: variable 'ansible_shell_type' from source: unknown 30529 1726882670.29984: variable 'ansible_shell_executable' from source: unknown 30529 1726882670.29986: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882670.29992: variable 'ansible_pipelining' from source: unknown 30529 1726882670.29996: variable 'ansible_timeout' from source: unknown 30529 1726882670.29998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882670.30138: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882670.30147: variable 'omit' from source: magic vars 30529 1726882670.30152: starting attempt loop 30529 1726882670.30155: running the handler 30529 1726882670.30170: _low_level_execute_command(): starting 30529 1726882670.30176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882670.30901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882670.30906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882670.30922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.30944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.30976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.32622: stdout chunk (state=3): >>>/root <<< 30529 1726882670.32725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.32751: stderr chunk (state=3): >>><<< 30529 1726882670.32755: stdout chunk (state=3): >>><<< 30529 1726882670.32773: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882670.32785: _low_level_execute_command(): starting 30529 1726882670.32791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137 `" && echo ansible-tmp-1726882670.3277261-34530-78668723580137="` echo /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137 `" ) && sleep 0' 30529 1726882670.33192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882670.33198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882670.33225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.33228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30529 1726882670.33237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.33285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882670.33297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.33336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.35192: stdout chunk (state=3): >>>ansible-tmp-1726882670.3277261-34530-78668723580137=/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137 <<< 30529 1726882670.35298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.35323: stderr chunk (state=3): >>><<< 30529 1726882670.35327: stdout chunk (state=3): >>><<< 30529 1726882670.35340: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882670.3277261-34530-78668723580137=/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882670.35376: variable 'ansible_module_compression' from source: unknown 30529 1726882670.35416: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882670.35467: variable 'ansible_facts' from source: unknown 30529 1726882670.35584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py 30529 1726882670.35683: Sending initial data 30529 1726882670.35686: Sent initial data (161 bytes) 30529 1726882670.36075: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882670.36110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882670.36113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.36115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.36160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882670.36164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.36215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.37714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30529 1726882670.37719: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882670.37753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882670.37796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_i8gsdov /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py <<< 30529 1726882670.37799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py" <<< 30529 1726882670.37840: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30529 1726882670.37844: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_i8gsdov" to remote "/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py" <<< 30529 1726882670.38860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.38898: stderr chunk (state=3): >>><<< 30529 1726882670.38901: stdout chunk (state=3): >>><<< 30529 1726882670.38930: done transferring module to remote 30529 1726882670.38939: _low_level_execute_command(): starting 30529 1726882670.38944: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/ /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py && sleep 0' 30529 1726882670.39370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882670.39374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882670.39376: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882670.39378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.39427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882670.39435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.39475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.41198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.41218: stderr chunk (state=3): >>><<< 30529 1726882670.41222: stdout chunk (state=3): >>><<< 30529 1726882670.41232: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882670.41235: _low_level_execute_command(): starting 30529 1726882670.41240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/AnsiballZ_package_facts.py && sleep 0' 30529 1726882670.41655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882670.41659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.41662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882670.41664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882670.41665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.41712: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882670.41716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.41766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.85968: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": 
"12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882670.85984: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30529 1726882670.86048: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": 
"3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": 
[{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30529 1726882670.86064: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", 
"version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": 
"ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0,
"arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": 
"1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33",
"release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": 
"0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null,
"arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882670.87959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882670.87962: stdout chunk (state=3): >>><<< 30529 1726882670.87965: stderr chunk (state=3): >>><<< 30529 1726882670.88205: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882670.89332: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882670.89348: _low_level_execute_command(): starting 30529 1726882670.89352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882670.3277261-34530-78668723580137/ > /dev/null 2>&1 && sleep 0' 30529 1726882670.89769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882670.89776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882670.89799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.89802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882670.89816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882670.89818: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882670.89868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882670.89872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882670.89876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882670.89918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882670.91726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882670.91755: stderr chunk (state=3): >>><<< 30529 1726882670.91757: stdout chunk (state=3): >>><<< 30529 1726882670.91768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882670.91798: handler run complete 30529 1726882670.92328: variable 'ansible_facts' from source: unknown 30529 1726882670.92589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882670.94191: variable 'ansible_facts' from source: unknown 30529 1726882670.94511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882670.95758: attempt loop complete, returning result 30529 1726882670.95762: _execute() done 30529 1726882670.95764: dumping result to json 30529 1726882670.95976: done dumping result, returning 30529 1726882670.95980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000001b98] 30529 1726882670.95983: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b98 30529 1726882670.99062: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b98 30529 1726882670.99066: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882670.99217: no more pending results, returning what we have 30529 1726882670.99221: results queue empty 30529 1726882670.99222: checking for any_errors_fatal 30529 1726882670.99229: done checking for any_errors_fatal 30529 1726882670.99229: checking for max_fail_percentage 30529 1726882670.99231: done checking for max_fail_percentage 30529 1726882670.99232: checking to see if all hosts have failed and the running result is not ok 30529 1726882670.99233: done checking to see if all hosts have failed 30529 1726882670.99233: getting the remaining hosts for this loop 30529 1726882670.99235: done getting the remaining 
hosts for this loop 30529 1726882670.99239: getting the next task for host managed_node1 30529 1726882670.99247: done getting next task for host managed_node1 30529 1726882670.99250: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882670.99256: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882670.99270: getting variables 30529 1726882670.99271: in VariableManager get_vars() 30529 1726882670.99509: Calling all_inventory to load vars for managed_node1 30529 1726882670.99517: Calling groups_inventory to load vars for managed_node1 30529 1726882670.99520: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882670.99530: Calling all_plugins_play to load vars for managed_node1 30529 1726882670.99533: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882670.99536: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.01230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.02855: done with get_vars() 30529 1726882671.02882: done getting variables 30529 1726882671.02950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:51 -0400 (0:00:00.745) 0:01:25.055 ****** 30529 1726882671.02988: entering _queue_task() for managed_node1/debug 30529 1726882671.03372: worker is 1 (out of 1 available) 30529 1726882671.03387: exiting _queue_task() for managed_node1/debug 30529 1726882671.03547: done queuing things up, now waiting for results queue to drain 30529 1726882671.03550: waiting for pending results... 
30529 1726882671.03776: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882671.03889: in run() - task 12673a56-9f93-b0f1-edc0-000000001b3c 30529 1726882671.03915: variable 'ansible_search_path' from source: unknown 30529 1726882671.03981: variable 'ansible_search_path' from source: unknown 30529 1726882671.03985: calling self._execute() 30529 1726882671.04071: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.04086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.04105: variable 'omit' from source: magic vars 30529 1726882671.04530: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.04553: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.04564: variable 'omit' from source: magic vars 30529 1726882671.04635: variable 'omit' from source: magic vars 30529 1726882671.04746: variable 'network_provider' from source: set_fact 30529 1726882671.04853: variable 'omit' from source: magic vars 30529 1726882671.04857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882671.04863: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882671.04895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882671.04919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882671.04936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882671.04978: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882671.04989: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882671.05001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.05119: Set connection var ansible_shell_executable to /bin/sh 30529 1726882671.05180: Set connection var ansible_pipelining to False 30529 1726882671.05183: Set connection var ansible_shell_type to sh 30529 1726882671.05186: Set connection var ansible_timeout to 10 30529 1726882671.05188: Set connection var ansible_connection to ssh 30529 1726882671.05190: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882671.05192: variable 'ansible_shell_executable' from source: unknown 30529 1726882671.05196: variable 'ansible_connection' from source: unknown 30529 1726882671.05203: variable 'ansible_module_compression' from source: unknown 30529 1726882671.05210: variable 'ansible_shell_type' from source: unknown 30529 1726882671.05289: variable 'ansible_shell_executable' from source: unknown 30529 1726882671.05295: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.05298: variable 'ansible_pipelining' from source: unknown 30529 1726882671.05300: variable 'ansible_timeout' from source: unknown 30529 1726882671.05302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.05389: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882671.05418: variable 'omit' from source: magic vars 30529 1726882671.05429: starting attempt loop 30529 1726882671.05437: running the handler 30529 1726882671.05486: handler run complete 30529 1726882671.05527: attempt loop complete, returning result 30529 1726882671.05530: _execute() done 30529 1726882671.05533: dumping result to json 30529 1726882671.05535: done dumping result, returning 
30529 1726882671.05599: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000001b3c] 30529 1726882671.05602: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3c ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882671.05794: no more pending results, returning what we have 30529 1726882671.05798: results queue empty 30529 1726882671.05800: checking for any_errors_fatal 30529 1726882671.05811: done checking for any_errors_fatal 30529 1726882671.05812: checking for max_fail_percentage 30529 1726882671.05814: done checking for max_fail_percentage 30529 1726882671.05815: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.05816: done checking to see if all hosts have failed 30529 1726882671.05817: getting the remaining hosts for this loop 30529 1726882671.05819: done getting the remaining hosts for this loop 30529 1726882671.05823: getting the next task for host managed_node1 30529 1726882671.05834: done getting next task for host managed_node1 30529 1726882671.05838: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882671.05844: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.05860: getting variables 30529 1726882671.05862: in VariableManager get_vars() 30529 1726882671.06110: Calling all_inventory to load vars for managed_node1 30529 1726882671.06113: Calling groups_inventory to load vars for managed_node1 30529 1726882671.06116: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.06126: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.06129: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.06133: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.06718: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3c 30529 1726882671.06721: WORKER PROCESS EXITING 30529 1726882671.07573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.09186: done with get_vars() 30529 1726882671.09211: done getting variables 30529 1726882671.09278: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:51 -0400 (0:00:00.063) 0:01:25.119 ****** 30529 1726882671.09325: entering _queue_task() for managed_node1/fail 30529 1726882671.09674: worker is 1 (out of 1 available) 30529 1726882671.09801: exiting _queue_task() for managed_node1/fail 30529 1726882671.09813: done queuing things up, now waiting for results queue to drain 30529 1726882671.09815: waiting for pending results... 30529 1726882671.10006: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882671.10154: in run() - task 12673a56-9f93-b0f1-edc0-000000001b3d 30529 1726882671.10234: variable 'ansible_search_path' from source: unknown 30529 1726882671.10238: variable 'ansible_search_path' from source: unknown 30529 1726882671.10242: calling self._execute() 30529 1726882671.10333: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.10350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.10371: variable 'omit' from source: magic vars 30529 1726882671.10797: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.10819: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.10956: variable 'network_state' from source: role '' defaults 30529 1726882671.11000: Evaluated conditional (network_state != {}): False 30529 1726882671.11004: when evaluation is False, skipping this task 30529 1726882671.11006: _execute() done 30529 1726882671.11009: dumping result to json 30529 1726882671.11011: done dumping result, returning 30529 1726882671.11097: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000001b3d] 30529 1726882671.11104: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3d 30529 1726882671.11178: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3d 30529 1726882671.11181: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882671.11234: no more pending results, returning what we have 30529 1726882671.11239: results queue empty 30529 1726882671.11240: checking for any_errors_fatal 30529 1726882671.11248: done checking for any_errors_fatal 30529 1726882671.11249: checking for max_fail_percentage 30529 1726882671.11251: done checking for max_fail_percentage 30529 1726882671.11252: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.11253: done checking to see if all hosts have failed 30529 1726882671.11254: getting the remaining hosts for this loop 30529 1726882671.11256: done getting the remaining hosts for this loop 30529 1726882671.11260: getting the next task for host managed_node1 30529 1726882671.11269: done getting next task for host managed_node1 30529 1726882671.11273: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882671.11280: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.11430: getting variables 30529 1726882671.11432: in VariableManager get_vars() 30529 1726882671.11473: Calling all_inventory to load vars for managed_node1 30529 1726882671.11476: Calling groups_inventory to load vars for managed_node1 30529 1726882671.11478: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.11489: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.11595: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.11601: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.13224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.14830: done with get_vars() 30529 1726882671.14858: done getting variables 30529 1726882671.14921: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:51 -0400 (0:00:00.056) 0:01:25.175 ****** 30529 1726882671.14957: entering _queue_task() for managed_node1/fail 30529 1726882671.15412: worker is 1 (out of 1 available) 30529 1726882671.15430: exiting _queue_task() for managed_node1/fail 30529 1726882671.15441: done queuing things up, now waiting for results queue to drain 30529 1726882671.15443: waiting for pending results... 30529 1726882671.15875: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882671.15880: in run() - task 12673a56-9f93-b0f1-edc0-000000001b3e 30529 1726882671.15883: variable 'ansible_search_path' from source: unknown 30529 1726882671.15886: variable 'ansible_search_path' from source: unknown 30529 1726882671.15907: calling self._execute() 30529 1726882671.16020: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.16032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.16048: variable 'omit' from source: magic vars 30529 1726882671.16451: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.16469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.16600: variable 'network_state' from source: role '' defaults 30529 1726882671.16622: Evaluated conditional (network_state != {}): False 30529 1726882671.16635: when evaluation is False, skipping this task 30529 1726882671.16698: _execute() done 30529 1726882671.16702: dumping result to json 30529 1726882671.16704: done dumping result, returning 30529 1726882671.16707: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000001b3e] 30529 1726882671.16709: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3e 30529 1726882671.16843: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3e 30529 1726882671.16846: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882671.17099: no more pending results, returning what we have 30529 1726882671.17104: results queue empty 30529 1726882671.17105: checking for any_errors_fatal 30529 1726882671.17111: done checking for any_errors_fatal 30529 1726882671.17112: checking for max_fail_percentage 30529 1726882671.17114: done checking for max_fail_percentage 30529 1726882671.17114: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.17115: done checking to see if all hosts have failed 30529 1726882671.17116: getting the remaining hosts for this loop 30529 1726882671.17118: done getting the remaining hosts for this loop 30529 1726882671.17121: getting the next task for host managed_node1 30529 1726882671.17130: done getting next task for host managed_node1 30529 1726882671.17133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882671.17139: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.17162: getting variables 30529 1726882671.17164: in VariableManager get_vars() 30529 1726882671.17203: Calling all_inventory to load vars for managed_node1 30529 1726882671.17206: Calling groups_inventory to load vars for managed_node1 30529 1726882671.17208: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.17220: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.17223: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.17226: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.18637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.20281: done with get_vars() 30529 1726882671.20305: done getting variables 30529 1726882671.20367: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:51 -0400 (0:00:00.054) 0:01:25.230 ****** 30529 1726882671.20403: entering _queue_task() for managed_node1/fail 30529 1726882671.20732: worker is 1 (out of 1 available) 30529 1726882671.20745: exiting _queue_task() for managed_node1/fail 30529 1726882671.20758: done queuing things up, now waiting for results queue to drain 30529 1726882671.20760: waiting for pending results... 30529 1726882671.21111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882671.21225: in run() - task 12673a56-9f93-b0f1-edc0-000000001b3f 30529 1726882671.21245: variable 'ansible_search_path' from source: unknown 30529 1726882671.21254: variable 'ansible_search_path' from source: unknown 30529 1726882671.21314: calling self._execute() 30529 1726882671.21404: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.21415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.21534: variable 'omit' from source: magic vars 30529 1726882671.21835: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.21858: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.22055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.24846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.24928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.24972: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.25011: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.25047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.25133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.25170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.25240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.25260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.25282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.25391: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.25416: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882671.25567: variable 'ansible_distribution' from source: facts 30529 1726882671.25570: variable '__network_rh_distros' from source: role '' defaults 30529 1726882671.25572: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882671.25858: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.25899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.26003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.26007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.26010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.26043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.26069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.26110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.26148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882671.26163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.26204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.26235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.26257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.26294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.26313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.26642: variable 'network_connections' from source: include params 30529 1726882671.26669: variable 'interface' from source: play vars 30529 1726882671.26762: variable 'interface' from source: play vars 30529 1726882671.26765: variable 'network_state' from source: role '' defaults 30529 1726882671.26822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882671.27000: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882671.27041: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882671.27088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882671.27117: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882671.27199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882671.27202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882671.27234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.27264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882671.27304: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882671.27398: when evaluation is False, skipping this task 30529 1726882671.27403: _execute() done 30529 1726882671.27405: dumping result to json 30529 1726882671.27407: done dumping result, returning 30529 1726882671.27410: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000001b3f] 30529 1726882671.27412: sending task result for task 
12673a56-9f93-b0f1-edc0-000000001b3f 30529 1726882671.27473: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b3f 30529 1726882671.27476: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882671.27522: no more pending results, returning what we have 30529 1726882671.27526: results queue empty 30529 1726882671.27527: checking for any_errors_fatal 30529 1726882671.27532: done checking for any_errors_fatal 30529 1726882671.27533: checking for max_fail_percentage 30529 1726882671.27535: done checking for max_fail_percentage 30529 1726882671.27536: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.27537: done checking to see if all hosts have failed 30529 1726882671.27537: getting the remaining hosts for this loop 30529 1726882671.27539: done getting the remaining hosts for this loop 30529 1726882671.27543: getting the next task for host managed_node1 30529 1726882671.27551: done getting next task for host managed_node1 30529 1726882671.27554: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882671.27560: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.27585: getting variables 30529 1726882671.27587: in VariableManager get_vars() 30529 1726882671.27633: Calling all_inventory to load vars for managed_node1 30529 1726882671.27636: Calling groups_inventory to load vars for managed_node1 30529 1726882671.27639: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.27650: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.27653: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.27657: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.29627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.32049: done with get_vars() 30529 1726882671.32076: done getting variables 30529 1726882671.32253: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:51 -0400 (0:00:00.118) 0:01:25.348 ****** 30529 1726882671.32288: entering _queue_task() for managed_node1/dnf 30529 1726882671.32978: worker is 1 (out of 1 available) 30529 1726882671.32990: exiting _queue_task() for managed_node1/dnf 30529 1726882671.33006: done queuing things up, now waiting for results queue to drain 30529 1726882671.33008: waiting for pending results... 30529 1726882671.33410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882671.33448: in run() - task 12673a56-9f93-b0f1-edc0-000000001b40 30529 1726882671.33463: variable 'ansible_search_path' from source: unknown 30529 1726882671.33467: variable 'ansible_search_path' from source: unknown 30529 1726882671.33599: calling self._execute() 30529 1726882671.33608: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.33611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.33618: variable 'omit' from source: magic vars 30529 1726882671.34015: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.34027: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.34458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.38503: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.38609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.38613: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.38616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.38638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.38799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.38802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.38804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.38825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.38839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.38962: variable 'ansible_distribution' from source: facts 30529 1726882671.38966: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.38984: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882671.39109: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882671.39240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.39261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.39287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.39328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.39342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.39380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.39408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.39431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.39525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.39529: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.39531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.39541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.39566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.39607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.39798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.39801: variable 'network_connections' from source: include params 30529 1726882671.39803: variable 'interface' from source: play vars 30529 1726882671.39850: variable 'interface' from source: play vars 30529 1726882671.39919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882671.40300: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882671.40303: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882671.40306: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882671.40308: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882671.40310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882671.40312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882671.40323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.40325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882671.40332: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882671.40586: variable 'network_connections' from source: include params 30529 1726882671.40595: variable 'interface' from source: play vars 30529 1726882671.40650: variable 'interface' from source: play vars 30529 1726882671.40673: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882671.40677: when evaluation is False, skipping this task 30529 1726882671.40679: _execute() done 30529 1726882671.40682: dumping result to json 30529 1726882671.40684: done dumping result, returning 30529 1726882671.40696: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001b40] 30529 
1726882671.40699: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b40 30529 1726882671.40797: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b40 30529 1726882671.40800: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882671.40883: no more pending results, returning what we have 30529 1726882671.40887: results queue empty 30529 1726882671.40888: checking for any_errors_fatal 30529 1726882671.40895: done checking for any_errors_fatal 30529 1726882671.40896: checking for max_fail_percentage 30529 1726882671.40898: done checking for max_fail_percentage 30529 1726882671.40899: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.40900: done checking to see if all hosts have failed 30529 1726882671.40900: getting the remaining hosts for this loop 30529 1726882671.40902: done getting the remaining hosts for this loop 30529 1726882671.40906: getting the next task for host managed_node1 30529 1726882671.40914: done getting next task for host managed_node1 30529 1726882671.40918: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882671.40924: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.40949: getting variables 30529 1726882671.40951: in VariableManager get_vars() 30529 1726882671.40989: Calling all_inventory to load vars for managed_node1 30529 1726882671.40992: Calling groups_inventory to load vars for managed_node1 30529 1726882671.41134: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.41144: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.41147: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.41149: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.42508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.45103: done with get_vars() 30529 1726882671.45131: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882671.45212: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:51 -0400 (0:00:00.129) 0:01:25.478 ****** 30529 1726882671.45245: entering _queue_task() for managed_node1/yum 30529 1726882671.46042: worker is 1 (out of 1 available) 30529 1726882671.46054: exiting _queue_task() for managed_node1/yum 30529 1726882671.46067: done queuing things up, now waiting for results queue to drain 30529 1726882671.46069: waiting for pending results... 30529 1726882671.46487: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882671.46872: in run() - task 12673a56-9f93-b0f1-edc0-000000001b41 30529 1726882671.47052: variable 'ansible_search_path' from source: unknown 30529 1726882671.47056: variable 'ansible_search_path' from source: unknown 30529 1726882671.47059: calling self._execute() 30529 1726882671.47220: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.47279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.47298: variable 'omit' from source: magic vars 30529 1726882671.48098: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.48102: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.48457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.52142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.52219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.52260: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.52305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.52336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.52425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.52471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.52509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.52600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.52604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.52673: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.52711: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882671.52722: when evaluation is False, skipping this task 30529 1726882671.52730: _execute() done 30529 1726882671.52737: dumping result to json 30529 1726882671.52755: done dumping result, returning 30529 1726882671.52942: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001b41] 30529 1726882671.52946: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b41 30529 1726882671.53023: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b41 30529 1726882671.53199: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882671.53253: no more pending results, returning what we have 30529 1726882671.53257: results queue empty 30529 1726882671.53258: checking for any_errors_fatal 30529 1726882671.53263: done checking for any_errors_fatal 30529 1726882671.53264: checking for max_fail_percentage 30529 1726882671.53266: done checking for max_fail_percentage 30529 1726882671.53266: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.53267: done checking to see if all hosts have failed 30529 1726882671.53268: getting the remaining hosts for this loop 30529 1726882671.53270: done getting the remaining hosts for this loop 30529 1726882671.53273: getting the next task for host managed_node1 30529 1726882671.53282: done getting next task for host managed_node1 30529 1726882671.53285: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882671.53296: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.53324: getting variables 30529 1726882671.53326: in VariableManager get_vars() 30529 1726882671.53367: Calling all_inventory to load vars for managed_node1 30529 1726882671.53369: Calling groups_inventory to load vars for managed_node1 30529 1726882671.53372: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.53382: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.53384: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.53389: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.55356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.57769: done with get_vars() 30529 1726882671.57905: done getting variables 30529 1726882671.58007: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:51 -0400 (0:00:00.128) 0:01:25.607 ****** 30529 1726882671.58121: entering _queue_task() for managed_node1/fail 30529 1726882671.59329: worker is 1 (out of 1 available) 30529 1726882671.59340: exiting _queue_task() for managed_node1/fail 30529 1726882671.59351: done queuing things up, now waiting for results queue to drain 30529 1726882671.59353: waiting for pending results... 30529 1726882671.59612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882671.59803: in run() - task 12673a56-9f93-b0f1-edc0-000000001b42 30529 1726882671.59806: variable 'ansible_search_path' from source: unknown 30529 1726882671.59809: variable 'ansible_search_path' from source: unknown 30529 1726882671.59820: calling self._execute() 30529 1726882671.59931: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.59942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.59956: variable 'omit' from source: magic vars 30529 1726882671.60454: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.60458: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.60503: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882671.60849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.64854: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.65100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.65105: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.65236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.65260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.65392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.65961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.65998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.66047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.66067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.66128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.66156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.66185: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.66236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.66255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.66302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.66336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.66363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.66408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.66498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.66609: variable 'network_connections' from source: include params 30529 1726882671.66626: variable 'interface' from source: play vars 30529 1726882671.66705: variable 'interface' from source: play vars 30529 1726882671.66786: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882671.66960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882671.67009: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882671.67042: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882671.67075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882671.67129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882671.67157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882671.67194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.67312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882671.67316: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882671.67602: variable 'network_connections' from source: include params 30529 1726882671.67614: variable 'interface' from source: play vars 30529 1726882671.67682: variable 'interface' from source: play vars 30529 1726882671.67717: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882671.67726: when evaluation is False, skipping this task 30529 
1726882671.67739: _execute() done 30529 1726882671.67799: dumping result to json 30529 1726882671.67803: done dumping result, returning 30529 1726882671.67805: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001b42] 30529 1726882671.67808: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b42 30529 1726882671.68100: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b42 30529 1726882671.68104: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882671.68155: no more pending results, returning what we have 30529 1726882671.68158: results queue empty 30529 1726882671.68159: checking for any_errors_fatal 30529 1726882671.68166: done checking for any_errors_fatal 30529 1726882671.68167: checking for max_fail_percentage 30529 1726882671.68169: done checking for max_fail_percentage 30529 1726882671.68170: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.68171: done checking to see if all hosts have failed 30529 1726882671.68171: getting the remaining hosts for this loop 30529 1726882671.68173: done getting the remaining hosts for this loop 30529 1726882671.68177: getting the next task for host managed_node1 30529 1726882671.68186: done getting next task for host managed_node1 30529 1726882671.68190: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882671.68197: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882671.68224: getting variables 30529 1726882671.68226: in VariableManager get_vars() 30529 1726882671.68268: Calling all_inventory to load vars for managed_node1 30529 1726882671.68271: Calling groups_inventory to load vars for managed_node1 30529 1726882671.68273: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.68282: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.68285: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.68288: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.70042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.71536: done with get_vars() 30529 1726882671.71559: done getting variables 30529 1726882671.71621: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:51 -0400 (0:00:00.135) 0:01:25.742 ****** 30529 1726882671.71658: entering _queue_task() for managed_node1/package 30529 1726882671.72020: worker is 1 (out of 1 available) 30529 1726882671.72033: exiting _queue_task() for managed_node1/package 30529 1726882671.72046: done queuing things up, now waiting for results queue to drain 30529 1726882671.72048: waiting for pending results... 30529 1726882671.72340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882671.72502: in run() - task 12673a56-9f93-b0f1-edc0-000000001b43 30529 1726882671.72526: variable 'ansible_search_path' from source: unknown 30529 1726882671.72535: variable 'ansible_search_path' from source: unknown 30529 1726882671.72574: calling self._execute() 30529 1726882671.72681: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.72695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.72712: variable 'omit' from source: magic vars 30529 1726882671.73108: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.73126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.73331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882671.73616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882671.73664: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882671.73715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882671.73779: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882671.74098: variable 'network_packages' from source: role '' defaults 30529 1726882671.74101: variable '__network_provider_setup' from source: role '' defaults 30529 1726882671.74103: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882671.74105: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882671.74107: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882671.74157: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882671.74351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.76353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.76425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.76465: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.76507: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.76539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.76623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.76657: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.76688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.76739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.76759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.76809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.76844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.76874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.76920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.76944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882671.77182: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882671.77300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.77329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.77358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.77477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.77481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.77515: variable 'ansible_python' from source: facts 30529 1726882671.77537: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882671.77624: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882671.77711: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882671.77840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.77869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.77901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.77949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.77969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.78029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.78099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.78102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.78140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.78160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.78308: variable 'network_connections' from source: include params 
30529 1726882671.78320: variable 'interface' from source: play vars 30529 1726882671.78455: variable 'interface' from source: play vars 30529 1726882671.78515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882671.78547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882671.78586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.78625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882671.78782: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882671.78970: variable 'network_connections' from source: include params 30529 1726882671.78980: variable 'interface' from source: play vars 30529 1726882671.79082: variable 'interface' from source: play vars 30529 1726882671.79125: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882671.79207: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882671.79503: variable 'network_connections' from source: include params 30529 1726882671.79511: variable 'interface' from source: play vars 30529 1726882671.79571: variable 'interface' from source: play vars 30529 1726882671.79599: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882671.79730: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882671.80050: variable 'network_connections' 
from source: include params 30529 1726882671.80061: variable 'interface' from source: play vars 30529 1726882671.80131: variable 'interface' from source: play vars 30529 1726882671.80188: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882671.80259: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882671.80271: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882671.80340: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882671.80634: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882671.81059: variable 'network_connections' from source: include params 30529 1726882671.81074: variable 'interface' from source: play vars 30529 1726882671.81136: variable 'interface' from source: play vars 30529 1726882671.81149: variable 'ansible_distribution' from source: facts 30529 1726882671.81157: variable '__network_rh_distros' from source: role '' defaults 30529 1726882671.81168: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.81191: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882671.81529: variable 'ansible_distribution' from source: facts 30529 1726882671.81532: variable '__network_rh_distros' from source: role '' defaults 30529 1726882671.81534: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.81536: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882671.81574: variable 'ansible_distribution' from source: facts 30529 1726882671.81577: variable '__network_rh_distros' from source: role '' defaults 30529 1726882671.81583: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.81627: variable 'network_provider' from source: set_fact 30529 
1726882671.81641: variable 'ansible_facts' from source: unknown 30529 1726882671.82353: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882671.82356: when evaluation is False, skipping this task 30529 1726882671.82359: _execute() done 30529 1726882671.82361: dumping result to json 30529 1726882671.82363: done dumping result, returning 30529 1726882671.82372: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000001b43] 30529 1726882671.82375: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b43 30529 1726882671.82475: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b43 30529 1726882671.82478: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882671.82543: no more pending results, returning what we have 30529 1726882671.82547: results queue empty 30529 1726882671.82548: checking for any_errors_fatal 30529 1726882671.82554: done checking for any_errors_fatal 30529 1726882671.82555: checking for max_fail_percentage 30529 1726882671.82557: done checking for max_fail_percentage 30529 1726882671.82557: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.82558: done checking to see if all hosts have failed 30529 1726882671.82559: getting the remaining hosts for this loop 30529 1726882671.82561: done getting the remaining hosts for this loop 30529 1726882671.82564: getting the next task for host managed_node1 30529 1726882671.82573: done getting next task for host managed_node1 30529 1726882671.82577: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882671.82582: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882671.82612: getting variables 30529 1726882671.82614: in VariableManager get_vars() 30529 1726882671.82658: Calling all_inventory to load vars for managed_node1 30529 1726882671.82660: Calling groups_inventory to load vars for managed_node1 30529 1726882671.82662: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.82672: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.82674: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.82677: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.83507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.84813: done with get_vars() 30529 1726882671.84845: done getting variables 30529 1726882671.84915: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:51 -0400 (0:00:00.132) 0:01:25.875 ****** 30529 1726882671.84954: entering _queue_task() for managed_node1/package 30529 1726882671.85345: worker is 1 (out of 1 available) 30529 1726882671.85358: exiting _queue_task() for managed_node1/package 30529 1726882671.85370: done queuing things up, now waiting for results queue to drain 30529 1726882671.85372: waiting for pending results... 
30529 1726882671.85814: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882671.85821: in run() - task 12673a56-9f93-b0f1-edc0-000000001b44 30529 1726882671.85826: variable 'ansible_search_path' from source: unknown 30529 1726882671.85838: variable 'ansible_search_path' from source: unknown 30529 1726882671.85872: calling self._execute() 30529 1726882671.85978: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.85983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.85999: variable 'omit' from source: magic vars 30529 1726882671.86386: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.86403: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.86514: variable 'network_state' from source: role '' defaults 30529 1726882671.86524: Evaluated conditional (network_state != {}): False 30529 1726882671.86527: when evaluation is False, skipping this task 30529 1726882671.86530: _execute() done 30529 1726882671.86533: dumping result to json 30529 1726882671.86535: done dumping result, returning 30529 1726882671.86544: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001b44] 30529 1726882671.86549: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b44 30529 1726882671.86656: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b44 30529 1726882671.86660: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882671.86715: no more pending results, returning what we have 30529 1726882671.86719: results queue empty 30529 1726882671.86720: checking 
for any_errors_fatal 30529 1726882671.86729: done checking for any_errors_fatal 30529 1726882671.86730: checking for max_fail_percentage 30529 1726882671.86732: done checking for max_fail_percentage 30529 1726882671.86732: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.86733: done checking to see if all hosts have failed 30529 1726882671.86734: getting the remaining hosts for this loop 30529 1726882671.86736: done getting the remaining hosts for this loop 30529 1726882671.86740: getting the next task for host managed_node1 30529 1726882671.86749: done getting next task for host managed_node1 30529 1726882671.86753: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882671.86760: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882671.86796: getting variables 30529 1726882671.86799: in VariableManager get_vars() 30529 1726882671.86844: Calling all_inventory to load vars for managed_node1 30529 1726882671.86846: Calling groups_inventory to load vars for managed_node1 30529 1726882671.86849: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.86861: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.86864: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.86867: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.88494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.89364: done with get_vars() 30529 1726882671.89381: done getting variables 30529 1726882671.89428: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:51 -0400 (0:00:00.045) 0:01:25.920 ****** 30529 1726882671.89455: entering _queue_task() for managed_node1/package 30529 1726882671.89723: worker is 1 (out of 1 available) 30529 1726882671.89736: exiting _queue_task() for managed_node1/package 30529 1726882671.89749: done queuing things up, now waiting for results queue to drain 30529 1726882671.89751: waiting for pending results... 
30529 1726882671.89946: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882671.90103: in run() - task 12673a56-9f93-b0f1-edc0-000000001b45 30529 1726882671.90108: variable 'ansible_search_path' from source: unknown 30529 1726882671.90110: variable 'ansible_search_path' from source: unknown 30529 1726882671.90162: calling self._execute() 30529 1726882671.90334: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.90338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.90341: variable 'omit' from source: magic vars 30529 1726882671.90638: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.90649: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.90767: variable 'network_state' from source: role '' defaults 30529 1726882671.90776: Evaluated conditional (network_state != {}): False 30529 1726882671.90779: when evaluation is False, skipping this task 30529 1726882671.90782: _execute() done 30529 1726882671.90785: dumping result to json 30529 1726882671.90790: done dumping result, returning 30529 1726882671.90806: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001b45] 30529 1726882671.90809: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b45 30529 1726882671.90909: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b45 30529 1726882671.90912: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882671.91039: no more pending results, returning what we have 30529 1726882671.91042: results queue empty 30529 1726882671.91043: checking for 
any_errors_fatal 30529 1726882671.91048: done checking for any_errors_fatal 30529 1726882671.91049: checking for max_fail_percentage 30529 1726882671.91050: done checking for max_fail_percentage 30529 1726882671.91051: checking to see if all hosts have failed and the running result is not ok 30529 1726882671.91052: done checking to see if all hosts have failed 30529 1726882671.91052: getting the remaining hosts for this loop 30529 1726882671.91054: done getting the remaining hosts for this loop 30529 1726882671.91057: getting the next task for host managed_node1 30529 1726882671.91064: done getting next task for host managed_node1 30529 1726882671.91068: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882671.91073: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882671.91099: getting variables 30529 1726882671.91101: in VariableManager get_vars() 30529 1726882671.91134: Calling all_inventory to load vars for managed_node1 30529 1726882671.91136: Calling groups_inventory to load vars for managed_node1 30529 1726882671.91138: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882671.91146: Calling all_plugins_play to load vars for managed_node1 30529 1726882671.91149: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882671.91151: Calling groups_plugins_play to load vars for managed_node1 30529 1726882671.92548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882671.94394: done with get_vars() 30529 1726882671.94418: done getting variables 30529 1726882671.94484: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:51 -0400 (0:00:00.050) 0:01:25.971 ****** 30529 1726882671.94523: entering _queue_task() for managed_node1/service 30529 1726882671.95109: worker is 1 (out of 1 available) 30529 1726882671.95119: exiting _queue_task() for managed_node1/service 30529 1726882671.95130: done queuing things up, now waiting for results queue to drain 30529 1726882671.95132: waiting for pending results... 
30529 1726882671.95512: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882671.95517: in run() - task 12673a56-9f93-b0f1-edc0-000000001b46 30529 1726882671.95521: variable 'ansible_search_path' from source: unknown 30529 1726882671.95523: variable 'ansible_search_path' from source: unknown 30529 1726882671.95526: calling self._execute() 30529 1726882671.95539: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882671.95545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882671.95555: variable 'omit' from source: magic vars 30529 1726882671.95956: variable 'ansible_distribution_major_version' from source: facts 30529 1726882671.95968: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882671.96098: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882671.96309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882671.98687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882671.98761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882671.98796: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882671.98834: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882671.98870: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882671.98949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882671.99003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.99029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.99076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.99094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.99137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.99158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.99194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.99228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.99298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.99301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882671.99308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882671.99332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882671.99367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882671.99380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882671.99550: variable 'network_connections' from source: include params 30529 1726882671.99561: variable 'interface' from source: play vars 30529 1726882671.99760: variable 'interface' from source: play vars 30529 1726882671.99763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882672.00160: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882672.00199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882672.00230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882672.00258: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882672.00385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882672.00409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882672.00550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.00577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882672.00900: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882672.00916: variable 'network_connections' from source: include params 30529 1726882672.00922: variable 'interface' from source: play vars 30529 1726882672.01000: variable 'interface' from source: play vars 30529 1726882672.01015: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882672.01019: when evaluation is False, skipping this task 30529 1726882672.01021: _execute() done 30529 1726882672.01024: dumping result to json 30529 1726882672.01026: done dumping result, returning 30529 1726882672.01036: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001b46] 30529 1726882672.01040: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b46 30529 1726882672.01140: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000001b46 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882672.01522: no more pending results, returning what we have 30529 1726882672.01526: results queue empty 30529 1726882672.01527: checking for any_errors_fatal 30529 1726882672.01533: done checking for any_errors_fatal 30529 1726882672.01534: checking for max_fail_percentage 30529 1726882672.01535: done checking for max_fail_percentage 30529 1726882672.01536: checking to see if all hosts have failed and the running result is not ok 30529 1726882672.01537: done checking to see if all hosts have failed 30529 1726882672.01538: getting the remaining hosts for this loop 30529 1726882672.01539: done getting the remaining hosts for this loop 30529 1726882672.01543: getting the next task for host managed_node1 30529 1726882672.01551: done getting next task for host managed_node1 30529 1726882672.01555: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882672.01560: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882672.01583: getting variables 30529 1726882672.01585: in VariableManager get_vars() 30529 1726882672.01632: Calling all_inventory to load vars for managed_node1 30529 1726882672.01636: Calling groups_inventory to load vars for managed_node1 30529 1726882672.01638: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882672.01644: WORKER PROCESS EXITING 30529 1726882672.01698: Calling all_plugins_play to load vars for managed_node1 30529 1726882672.01702: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882672.01705: Calling groups_plugins_play to load vars for managed_node1 30529 1726882672.03238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882672.06807: done with get_vars() 30529 1726882672.06844: done getting variables 30529 1726882672.06961: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:52 -0400 (0:00:00.124) 0:01:26.096 ****** 30529 1726882672.07061: entering _queue_task() for managed_node1/service 30529 1726882672.07746: worker is 1 (out of 1 available) 30529 1726882672.07759: exiting _queue_task() for managed_node1/service 30529 1726882672.07771: done 
queuing things up, now waiting for results queue to drain 30529 1726882672.07773: waiting for pending results... 30529 1726882672.08310: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882672.08645: in run() - task 12673a56-9f93-b0f1-edc0-000000001b47 30529 1726882672.08659: variable 'ansible_search_path' from source: unknown 30529 1726882672.08662: variable 'ansible_search_path' from source: unknown 30529 1726882672.08701: calling self._execute() 30529 1726882672.08797: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.09025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.09028: variable 'omit' from source: magic vars 30529 1726882672.09594: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.09898: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882672.09973: variable 'network_provider' from source: set_fact 30529 1726882672.09976: variable 'network_state' from source: role '' defaults 30529 1726882672.09988: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882672.10101: variable 'omit' from source: magic vars 30529 1726882672.10154: variable 'omit' from source: magic vars 30529 1726882672.10221: variable 'network_service_name' from source: role '' defaults 30529 1726882672.10238: variable 'network_service_name' from source: role '' defaults 30529 1726882672.10636: variable '__network_provider_setup' from source: role '' defaults 30529 1726882672.10642: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882672.10708: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882672.10711: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882672.10800: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882672.11401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882672.15827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882672.15888: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882672.15999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882672.16003: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882672.16207: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882672.16280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.16313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.16337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.16374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.16388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.16539: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.16560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.16700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.16703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.16705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.17165: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882672.17498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.17524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.17548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.17586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.17710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.17805: variable 'ansible_python' from source: facts 30529 1726882672.17824: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882672.18013: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882672.18088: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882672.18627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.18664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.18771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.18774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.18777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.18779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.19009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.19034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.19071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.19084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.19427: variable 'network_connections' from source: include params 30529 1726882672.19439: variable 'interface' from source: play vars 30529 1726882672.19532: variable 'interface' from source: play vars 30529 1726882672.19720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882672.20010: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882672.20075: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882672.20307: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882672.20345: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882672.21340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882672.21343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882672.21345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.21348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882672.21362: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882672.22064: variable 'network_connections' from source: include params 30529 1726882672.22067: variable 'interface' from source: play vars 30529 1726882672.22142: variable 'interface' from source: play vars 30529 1726882672.22172: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882672.22453: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882672.22926: variable 'network_connections' from source: include params 30529 1726882672.22929: variable 'interface' from source: play vars 30529 1726882672.22989: variable 'interface' from source: play vars 30529 1726882672.23417: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882672.23495: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882672.24003: variable 'network_connections' from source: include params 30529 1726882672.24013: variable 'interface' from source: play vars 30529 1726882672.24198: variable 'interface' from source: play vars 30529 1726882672.24248: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882672.24376: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882672.24436: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882672.24498: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882672.24990: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882672.25983: variable 'network_connections' from source: include params 30529 1726882672.26026: variable 'interface' from source: play vars 30529 1726882672.26169: variable 'interface' from source: play vars 30529 1726882672.26277: variable 'ansible_distribution' from source: facts 30529 1726882672.26281: variable '__network_rh_distros' from source: role '' defaults 30529 1726882672.26284: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.26286: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882672.26615: variable 'ansible_distribution' from source: facts 30529 1726882672.26675: variable '__network_rh_distros' from source: role '' defaults 30529 1726882672.26685: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.26710: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882672.27142: variable 'ansible_distribution' from source: facts 30529 1726882672.27146: variable '__network_rh_distros' from source: role '' defaults 30529 1726882672.27148: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.27162: variable 'network_provider' from source: set_fact 30529 1726882672.27233: variable 'omit' from source: magic vars 30529 1726882672.27360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882672.27373: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882672.27398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882672.27443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882672.27481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882672.27561: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882672.27576: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.27686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.27900: Set connection var ansible_shell_executable to /bin/sh 30529 1726882672.27903: Set connection var ansible_pipelining to False 30529 1726882672.27905: Set connection var ansible_shell_type to sh 30529 1726882672.27907: Set connection var ansible_timeout to 10 30529 1726882672.27909: Set connection var ansible_connection to ssh 30529 1726882672.27911: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882672.27938: variable 'ansible_shell_executable' from source: unknown 30529 1726882672.28008: variable 'ansible_connection' from source: unknown 30529 1726882672.28011: variable 'ansible_module_compression' from source: unknown 30529 1726882672.28013: variable 'ansible_shell_type' from source: unknown 30529 1726882672.28016: variable 'ansible_shell_executable' from source: unknown 30529 1726882672.28018: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.28019: variable 'ansible_pipelining' from source: unknown 30529 1726882672.28021: variable 'ansible_timeout' from source: unknown 30529 1726882672.28023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882672.28252: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882672.28306: variable 'omit' from source: magic vars 30529 1726882672.28510: starting attempt loop 30529 1726882672.28513: running the handler 30529 1726882672.28516: variable 'ansible_facts' from source: unknown 30529 1726882672.29701: _low_level_execute_command(): starting 30529 1726882672.29712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882672.30436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882672.30455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882672.30479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882672.30503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882672.30523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882672.30600: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.30652: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882672.30675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.30730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.32430: stdout chunk (state=3): >>>/root <<< 30529 1726882672.32550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882672.32562: stderr chunk (state=3): >>><<< 30529 1726882672.32616: stdout chunk (state=3): >>><<< 30529 1726882672.32692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882672.32698: _low_level_execute_command(): starting 30529 1726882672.32702: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439 `" && echo ansible-tmp-1726882672.3264537-34607-109028464063439="` echo /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439 `" ) && sleep 0' 30529 1726882672.33314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.33379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882672.33399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882672.33441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.33483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.35355: stdout chunk (state=3): >>>ansible-tmp-1726882672.3264537-34607-109028464063439=/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439 <<< 30529 1726882672.35510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882672.35513: 
stdout chunk (state=3): >>><<< 30529 1726882672.35515: stderr chunk (state=3): >>><<< 30529 1726882672.35699: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882672.3264537-34607-109028464063439=/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882672.35702: variable 'ansible_module_compression' from source: unknown 30529 1726882672.35705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882672.35707: variable 'ansible_facts' from source: unknown 30529 1726882672.35937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py 30529 1726882672.36114: Sending initial data 30529 1726882672.36123: Sent initial 
data (156 bytes) 30529 1726882672.36738: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882672.36752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882672.36765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882672.36781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882672.36813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882672.36903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882672.36935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882672.36954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.37030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.38529: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882672.38603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882672.38689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpew_svzey /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py <<< 30529 1726882672.38692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py" <<< 30529 1726882672.38743: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpew_svzey" to remote "/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py" <<< 30529 1726882672.40097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882672.40267: stderr chunk (state=3): >>><<< 30529 1726882672.40270: stdout chunk (state=3): >>><<< 30529 1726882672.40272: done transferring module to remote 30529 1726882672.40274: _low_level_execute_command(): starting 30529 1726882672.40276: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/ /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py && sleep 0' 30529 1726882672.40827: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882672.40835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882672.40850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882672.40862: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882672.40874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.40891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882672.40916: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882672.40940: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.41016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882672.41078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.41167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.42819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882672.42847: stderr chunk (state=3): >>><<< 30529 1726882672.42849: stdout chunk (state=3): >>><<< 30529 1726882672.42858: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882672.42898: _low_level_execute_command(): starting 30529 1726882672.42902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/AnsiballZ_systemd.py && sleep 0' 30529 1726882672.43273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882672.43276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.43278: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882672.43280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882672.43282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.43331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882672.43334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.43386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.71833: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", 
"GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10899456", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305496576", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1881202000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882672.71871: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882672.71878: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882672.73586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882672.73641: stderr chunk (state=3): >>><<< 30529 1726882672.73645: stdout chunk (state=3): >>><<< 30529 1726882672.73664: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10899456", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305496576", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1881202000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882672.73848: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882672.73864: _low_level_execute_command(): starting 30529 1726882672.73869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882672.3264537-34607-109028464063439/ > /dev/null 2>&1 && sleep 0' 30529 1726882672.74419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882672.74422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882672.74424: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.74426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882672.74428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882672.74430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882672.74485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882672.74488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882672.74490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882672.74537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882672.76325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882672.76364: stderr chunk (state=3): >>><<< 30529 1726882672.76368: stdout chunk (state=3): >>><<< 30529 1726882672.76399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882672.76402: handler run complete 30529 1726882672.76468: attempt loop complete, returning result 30529 1726882672.76471: _execute() done 30529 1726882672.76473: dumping result to json 30529 1726882672.76485: done dumping result, returning 30529 1726882672.76496: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000001b47] 30529 1726882672.76499: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b47 30529 1726882672.76734: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b47 30529 1726882672.76737: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882672.76829: no more pending results, returning what we have 30529 1726882672.76833: results queue empty 30529 1726882672.76834: checking for any_errors_fatal 30529 1726882672.76841: done checking for any_errors_fatal 30529 1726882672.76842: checking for max_fail_percentage 30529 1726882672.76843: done checking for max_fail_percentage 30529 1726882672.76844: checking to see if all hosts have failed and the running result is not ok 30529 1726882672.76845: done checking to see if all hosts have failed 30529 1726882672.76846: getting the remaining hosts for this loop 30529 1726882672.76847: done getting the remaining hosts for this loop 30529 1726882672.76855: getting the next task for host managed_node1 30529 1726882672.76863: done getting next task for host managed_node1 30529 1726882672.76870: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882672.76875: ^ state is: HOST STATE: 
block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882672.76892: getting variables 30529 1726882672.76896: in VariableManager get_vars() 30529 1726882672.76965: Calling all_inventory to load vars for managed_node1 30529 1726882672.76968: Calling groups_inventory to load vars for managed_node1 30529 1726882672.76971: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882672.76981: Calling all_plugins_play to load vars for managed_node1 30529 1726882672.76984: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882672.76986: Calling groups_plugins_play to load vars for managed_node1 30529 1726882672.78013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882672.78865: done with get_vars() 30529 1726882672.78881: done getting variables 30529 1726882672.78929: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:52 -0400 (0:00:00.719) 0:01:26.815 ****** 30529 1726882672.78956: entering _queue_task() for managed_node1/service 30529 1726882672.79203: worker is 1 (out of 1 available) 30529 1726882672.79216: exiting _queue_task() for managed_node1/service 30529 1726882672.79229: done queuing things up, now waiting for results queue to drain 30529 1726882672.79231: waiting for pending results... 
30529 1726882672.79411: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882672.79496: in run() - task 12673a56-9f93-b0f1-edc0-000000001b48 30529 1726882672.79509: variable 'ansible_search_path' from source: unknown 30529 1726882672.79513: variable 'ansible_search_path' from source: unknown 30529 1726882672.79540: calling self._execute() 30529 1726882672.79619: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.79623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.79631: variable 'omit' from source: magic vars 30529 1726882672.79908: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.79917: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882672.79996: variable 'network_provider' from source: set_fact 30529 1726882672.79999: Evaluated conditional (network_provider == "nm"): True 30529 1726882672.80064: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882672.80129: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882672.80244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882672.81666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882672.81713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882672.81740: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882672.81768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882672.81790: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882672.81860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.81880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.81899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.81925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.81936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.81969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.81990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.82009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.82034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.82045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.82076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.82095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.82112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.82135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.82146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.82243: variable 'network_connections' from source: include params 30529 1726882672.82252: variable 'interface' from source: play vars 30529 1726882672.82298: variable 'interface' from source: play vars 30529 1726882672.82347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882672.82458: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882672.82484: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882672.82510: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882672.82533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882672.82561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882672.82576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882672.82596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.82617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882672.82653: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882672.82803: variable 'network_connections' from source: include params 30529 1726882672.82807: variable 'interface' from source: play vars 30529 1726882672.82852: variable 'interface' from source: play vars 30529 1726882672.82872: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882672.82875: when evaluation is False, skipping this task 30529 1726882672.82878: _execute() done 30529 1726882672.82880: dumping result to json 30529 1726882672.82883: done dumping result, returning 30529 1726882672.82892: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000001b48] 30529 
1726882672.82905: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b48 30529 1726882672.82983: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b48 30529 1726882672.82986: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882672.83038: no more pending results, returning what we have 30529 1726882672.83042: results queue empty 30529 1726882672.83043: checking for any_errors_fatal 30529 1726882672.83062: done checking for any_errors_fatal 30529 1726882672.83063: checking for max_fail_percentage 30529 1726882672.83065: done checking for max_fail_percentage 30529 1726882672.83065: checking to see if all hosts have failed and the running result is not ok 30529 1726882672.83066: done checking to see if all hosts have failed 30529 1726882672.83067: getting the remaining hosts for this loop 30529 1726882672.83069: done getting the remaining hosts for this loop 30529 1726882672.83072: getting the next task for host managed_node1 30529 1726882672.83080: done getting next task for host managed_node1 30529 1726882672.83084: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882672.83091: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882672.83115: getting variables 30529 1726882672.83117: in VariableManager get_vars() 30529 1726882672.83152: Calling all_inventory to load vars for managed_node1 30529 1726882672.83155: Calling groups_inventory to load vars for managed_node1 30529 1726882672.83157: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882672.83165: Calling all_plugins_play to load vars for managed_node1 30529 1726882672.83168: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882672.83170: Calling groups_plugins_play to load vars for managed_node1 30529 1726882672.83967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882672.84834: done with get_vars() 30529 1726882672.84850: done getting variables 30529 1726882672.84891: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:52 -0400 (0:00:00.059) 0:01:26.875 
****** 30529 1726882672.84916: entering _queue_task() for managed_node1/service 30529 1726882672.85138: worker is 1 (out of 1 available) 30529 1726882672.85150: exiting _queue_task() for managed_node1/service 30529 1726882672.85163: done queuing things up, now waiting for results queue to drain 30529 1726882672.85167: waiting for pending results... 30529 1726882672.85342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882672.85439: in run() - task 12673a56-9f93-b0f1-edc0-000000001b49 30529 1726882672.85451: variable 'ansible_search_path' from source: unknown 30529 1726882672.85454: variable 'ansible_search_path' from source: unknown 30529 1726882672.85483: calling self._execute() 30529 1726882672.85557: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.85561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.85570: variable 'omit' from source: magic vars 30529 1726882672.85835: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.85845: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882672.85923: variable 'network_provider' from source: set_fact 30529 1726882672.85926: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882672.85929: when evaluation is False, skipping this task 30529 1726882672.85932: _execute() done 30529 1726882672.85934: dumping result to json 30529 1726882672.85938: done dumping result, returning 30529 1726882672.85948: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000001b49] 30529 1726882672.85950: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b49 30529 1726882672.86040: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b49 30529 1726882672.86042: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882672.86092: no more pending results, returning what we have 30529 1726882672.86097: results queue empty 30529 1726882672.86098: checking for any_errors_fatal 30529 1726882672.86105: done checking for any_errors_fatal 30529 1726882672.86106: checking for max_fail_percentage 30529 1726882672.86108: done checking for max_fail_percentage 30529 1726882672.86109: checking to see if all hosts have failed and the running result is not ok 30529 1726882672.86109: done checking to see if all hosts have failed 30529 1726882672.86110: getting the remaining hosts for this loop 30529 1726882672.86112: done getting the remaining hosts for this loop 30529 1726882672.86115: getting the next task for host managed_node1 30529 1726882672.86123: done getting next task for host managed_node1 30529 1726882672.86127: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882672.86132: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882672.86153: getting variables 30529 1726882672.86154: in VariableManager get_vars() 30529 1726882672.86186: Calling all_inventory to load vars for managed_node1 30529 1726882672.86190: Calling groups_inventory to load vars for managed_node1 30529 1726882672.86192: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882672.86201: Calling all_plugins_play to load vars for managed_node1 30529 1726882672.86204: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882672.86206: Calling groups_plugins_play to load vars for managed_node1 30529 1726882672.87082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882672.87944: done with get_vars() 30529 1726882672.87959: done getting variables 30529 1726882672.88005: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:52 -0400 (0:00:00.031) 0:01:26.906 ****** 30529 1726882672.88030: entering _queue_task() for managed_node1/copy 30529 1726882672.88245: worker is 1 (out of 1 available) 30529 1726882672.88258: exiting _queue_task() for managed_node1/copy 30529 1726882672.88270: done queuing things up, now waiting for results queue to drain 30529 1726882672.88272: waiting for 
pending results... 30529 1726882672.88450: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882672.88555: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4a 30529 1726882672.88566: variable 'ansible_search_path' from source: unknown 30529 1726882672.88570: variable 'ansible_search_path' from source: unknown 30529 1726882672.88605: calling self._execute() 30529 1726882672.88679: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.88682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.88690: variable 'omit' from source: magic vars 30529 1726882672.88975: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.88984: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882672.89067: variable 'network_provider' from source: set_fact 30529 1726882672.89070: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882672.89073: when evaluation is False, skipping this task 30529 1726882672.89076: _execute() done 30529 1726882672.89078: dumping result to json 30529 1726882672.89081: done dumping result, returning 30529 1726882672.89090: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000001b4a] 30529 1726882672.89097: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4a 30529 1726882672.89181: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4a 30529 1726882672.89184: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882672.89238: no more pending results, returning what we have 30529 1726882672.89242: results queue empty 30529 
1726882672.89243: checking for any_errors_fatal 30529 1726882672.89249: done checking for any_errors_fatal 30529 1726882672.89250: checking for max_fail_percentage 30529 1726882672.89252: done checking for max_fail_percentage 30529 1726882672.89253: checking to see if all hosts have failed and the running result is not ok 30529 1726882672.89254: done checking to see if all hosts have failed 30529 1726882672.89255: getting the remaining hosts for this loop 30529 1726882672.89256: done getting the remaining hosts for this loop 30529 1726882672.89260: getting the next task for host managed_node1 30529 1726882672.89267: done getting next task for host managed_node1 30529 1726882672.89270: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882672.89275: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882672.89299: getting variables 30529 1726882672.89301: in VariableManager get_vars() 30529 1726882672.89333: Calling all_inventory to load vars for managed_node1 30529 1726882672.89335: Calling groups_inventory to load vars for managed_node1 30529 1726882672.89337: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882672.89345: Calling all_plugins_play to load vars for managed_node1 30529 1726882672.89347: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882672.89349: Calling groups_plugins_play to load vars for managed_node1 30529 1726882672.90078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882672.91148: done with get_vars() 30529 1726882672.91163: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:52 -0400 (0:00:00.031) 0:01:26.938 ****** 30529 1726882672.91224: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882672.91447: worker is 1 (out of 1 available) 30529 1726882672.91459: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882672.91473: done queuing things up, now waiting for results queue to drain 30529 1726882672.91474: waiting for pending results... 
30529 1726882672.91650: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882672.91751: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4b 30529 1726882672.91763: variable 'ansible_search_path' from source: unknown 30529 1726882672.91766: variable 'ansible_search_path' from source: unknown 30529 1726882672.91796: calling self._execute() 30529 1726882672.91870: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882672.91874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882672.91882: variable 'omit' from source: magic vars 30529 1726882672.92155: variable 'ansible_distribution_major_version' from source: facts 30529 1726882672.92165: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882672.92171: variable 'omit' from source: magic vars 30529 1726882672.92216: variable 'omit' from source: magic vars 30529 1726882672.92326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882672.99099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882672.99150: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882672.99176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882672.99204: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882672.99227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882672.99277: variable 'network_provider' from source: set_fact 30529 1726882672.99365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882672.99385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882672.99408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882672.99438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882672.99447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882672.99499: variable 'omit' from source: magic vars 30529 1726882672.99572: variable 'omit' from source: magic vars 30529 1726882672.99642: variable 'network_connections' from source: include params 30529 1726882672.99655: variable 'interface' from source: play vars 30529 1726882672.99697: variable 'interface' from source: play vars 30529 1726882672.99784: variable 'omit' from source: magic vars 30529 1726882672.99795: variable '__lsr_ansible_managed' from source: task vars 30529 1726882672.99835: variable '__lsr_ansible_managed' from source: task vars 30529 1726882673.00105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882673.00188: Loaded config def from plugin (lookup/template) 30529 1726882673.00312: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882673.00605: File lookup term: get_ansible_managed.j2 30529 1726882673.00609: variable 
'ansible_search_path' from source: unknown 30529 1726882673.00612: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882673.00616: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882673.00619: variable 'ansible_search_path' from source: unknown 30529 1726882673.09114: variable 'ansible_managed' from source: unknown 30529 1726882673.09240: variable 'omit' from source: magic vars 30529 1726882673.09263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882673.09283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882673.09301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882673.09317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882673.09325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.09345: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882673.09348: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.09351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.09447: Set connection var ansible_shell_executable to /bin/sh 30529 1726882673.09450: Set connection var ansible_pipelining to False 30529 1726882673.09453: Set connection var ansible_shell_type to sh 30529 1726882673.09463: Set connection var ansible_timeout to 10 30529 1726882673.09466: Set connection var ansible_connection to ssh 30529 1726882673.09471: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882673.09497: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.09500: variable 'ansible_connection' from source: unknown 30529 1726882673.09504: variable 'ansible_module_compression' from source: unknown 30529 1726882673.09507: variable 'ansible_shell_type' from source: unknown 30529 1726882673.09510: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.09513: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.09515: variable 'ansible_pipelining' from source: unknown 30529 1726882673.09517: variable 'ansible_timeout' from source: unknown 30529 1726882673.09519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.09698: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882673.09709: variable 'omit' from 
source: magic vars 30529 1726882673.09712: starting attempt loop 30529 1726882673.09714: running the handler 30529 1726882673.09716: _low_level_execute_command(): starting 30529 1726882673.09718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882673.10399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882673.10420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.10432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.10514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.12154: stdout chunk (state=3): >>>/root <<< 30529 1726882673.12285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.12339: stderr chunk (state=3): >>><<< 30529 1726882673.12342: stdout chunk (state=3): >>><<< 30529 1726882673.12448: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.12452: _low_level_execute_command(): starting 30529 1726882673.12456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348 `" && echo ansible-tmp-1726882673.1236143-34633-198233291876348="` echo /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348 `" ) && sleep 0' 30529 1726882673.13169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882673.13172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.13175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.13177: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.13179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882673.13181: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882673.13183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.13185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882673.13187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882673.13189: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882673.13191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.13192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.13199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.13399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.13452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.15310: stdout chunk (state=3): >>>ansible-tmp-1726882673.1236143-34633-198233291876348=/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348 <<< 30529 1726882673.15463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.15467: stdout chunk (state=3): >>><<< 30529 1726882673.15470: stderr chunk (state=3): >>><<< 30529 1726882673.15704: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726882673.1236143-34633-198233291876348=/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.15714: variable 'ansible_module_compression' from source: unknown 30529 1726882673.15718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882673.15720: variable 'ansible_facts' from source: unknown 30529 1726882673.15885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py 30529 1726882673.16204: Sending initial data 30529 1726882673.16213: Sent initial data (168 bytes) 30529 1726882673.17221: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.17241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.17309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.17328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.17352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.17510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.19234: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882673.19273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882673.19328: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpy_r4g6v3 /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py <<< 30529 1726882673.19334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py" <<< 30529 1726882673.19385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpy_r4g6v3" to remote "/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py" <<< 30529 1726882673.21063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.21129: stderr chunk (state=3): >>><<< 30529 1726882673.21142: stdout chunk (state=3): >>><<< 30529 1726882673.21202: done transferring module to remote 30529 1726882673.21222: _low_level_execute_command(): starting 30529 1726882673.21231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/ /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py && sleep 0' 30529 1726882673.21909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.21957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882673.21975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.22004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.22086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.23871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.23996: stdout chunk (state=3): >>><<< 30529 1726882673.24003: stderr chunk (state=3): >>><<< 30529 1726882673.24006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.24008: _low_level_execute_command(): starting 30529 1726882673.24011: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/AnsiballZ_network_connections.py && sleep 0' 30529 1726882673.25224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882673.25337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882673.25350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.25512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.25602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.52040: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tuvgvks0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tuvgvks0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/20e24cfc-e38f-4d09-8124-2176ed3997b7: error=unknown <<< 30529 1726882673.52160: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882673.53828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882673.53854: stderr chunk (state=3): >>><<< 30529 1726882673.53865: stdout chunk (state=3): >>><<< 30529 1726882673.53905: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tuvgvks0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tuvgvks0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/20e24cfc-e38f-4d09-8124-2176ed3997b7: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882673.53942: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882673.53977: _low_level_execute_command(): starting 30529 1726882673.53981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882673.1236143-34633-198233291876348/ > /dev/null 2>&1 && sleep 0' 30529 1726882673.54600: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.54604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882673.54606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882673.54608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.54640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882673.54643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.54678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.54743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.56518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.56539: stderr chunk (state=3): >>><<< 30529 1726882673.56543: stdout chunk (state=3): >>><<< 30529 1726882673.56555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.56563: handler run complete 30529 1726882673.56579: attempt loop complete, returning result 30529 1726882673.56582: _execute() done 30529 1726882673.56584: dumping result to json 30529 1726882673.56595: done dumping result, returning 30529 1726882673.56600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000001b4b] 30529 1726882673.56606: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4b 30529 1726882673.56735: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4b 30529 1726882673.56738: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 
30529 1726882673.56867: no more pending results, returning what we have 30529 1726882673.56870: results queue empty 30529 1726882673.56871: checking for any_errors_fatal 30529 1726882673.56876: done checking for any_errors_fatal 30529 1726882673.56879: checking for max_fail_percentage 30529 1726882673.56881: done checking for max_fail_percentage 30529 1726882673.56882: checking to see if all hosts have failed and the running result is not ok 30529 1726882673.56882: done checking to see if all hosts have failed 30529 1726882673.56883: getting the remaining hosts for this loop 30529 1726882673.56885: done getting the remaining hosts for this loop 30529 1726882673.56890: getting the next task for host managed_node1 30529 1726882673.56898: done getting next task for host managed_node1 30529 1726882673.56902: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882673.56906: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882673.56919: getting variables 30529 1726882673.56920: in VariableManager get_vars() 30529 1726882673.56957: Calling all_inventory to load vars for managed_node1 30529 1726882673.56959: Calling groups_inventory to load vars for managed_node1 30529 1726882673.56961: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882673.56970: Calling all_plugins_play to load vars for managed_node1 30529 1726882673.56972: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882673.56974: Calling groups_plugins_play to load vars for managed_node1 30529 1726882673.62716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882673.63716: done with get_vars() 30529 1726882673.63731: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:53 -0400 (0:00:00.725) 0:01:27.663 ****** 30529 1726882673.63790: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882673.64087: worker is 1 (out of 1 available) 30529 1726882673.64102: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882673.64114: done queuing things up, now waiting for results queue to drain 30529 1726882673.64117: waiting for pending results... 
30529 1726882673.64319: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882673.64416: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4c 30529 1726882673.64428: variable 'ansible_search_path' from source: unknown 30529 1726882673.64431: variable 'ansible_search_path' from source: unknown 30529 1726882673.64463: calling self._execute() 30529 1726882673.64543: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.64548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.64556: variable 'omit' from source: magic vars 30529 1726882673.64886: variable 'ansible_distribution_major_version' from source: facts 30529 1726882673.64927: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882673.65041: variable 'network_state' from source: role '' defaults 30529 1726882673.65044: Evaluated conditional (network_state != {}): False 30529 1726882673.65047: when evaluation is False, skipping this task 30529 1726882673.65049: _execute() done 30529 1726882673.65052: dumping result to json 30529 1726882673.65055: done dumping result, returning 30529 1726882673.65058: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000001b4c] 30529 1726882673.65063: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4c 30529 1726882673.65185: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4c 30529 1726882673.65188: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882673.65243: no more pending results, returning what we have 30529 1726882673.65247: results queue empty 30529 1726882673.65248: checking for any_errors_fatal 30529 1726882673.65262: done checking for any_errors_fatal 
30529 1726882673.65263: checking for max_fail_percentage 30529 1726882673.65265: done checking for max_fail_percentage 30529 1726882673.65266: checking to see if all hosts have failed and the running result is not ok 30529 1726882673.65267: done checking to see if all hosts have failed 30529 1726882673.65267: getting the remaining hosts for this loop 30529 1726882673.65269: done getting the remaining hosts for this loop 30529 1726882673.65272: getting the next task for host managed_node1 30529 1726882673.65281: done getting next task for host managed_node1 30529 1726882673.65284: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882673.65289: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882673.65316: getting variables 30529 1726882673.65318: in VariableManager get_vars() 30529 1726882673.65353: Calling all_inventory to load vars for managed_node1 30529 1726882673.65355: Calling groups_inventory to load vars for managed_node1 30529 1726882673.65357: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882673.65367: Calling all_plugins_play to load vars for managed_node1 30529 1726882673.65369: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882673.65371: Calling groups_plugins_play to load vars for managed_node1 30529 1726882673.66500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882673.67831: done with get_vars() 30529 1726882673.67856: done getting variables 30529 1726882673.67918: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:53 -0400 (0:00:00.041) 0:01:27.705 ****** 30529 1726882673.67955: entering _queue_task() for managed_node1/debug 30529 1726882673.68433: worker is 1 (out of 1 available) 30529 1726882673.68447: exiting _queue_task() for managed_node1/debug 30529 1726882673.68464: done queuing things up, now waiting for results queue to drain 30529 1726882673.68465: waiting for pending results... 
30529 1726882673.69031: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882673.69089: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4d 30529 1726882673.69148: variable 'ansible_search_path' from source: unknown 30529 1726882673.69153: variable 'ansible_search_path' from source: unknown 30529 1726882673.69178: calling self._execute() 30529 1726882673.69313: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.69374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.69378: variable 'omit' from source: magic vars 30529 1726882673.69835: variable 'ansible_distribution_major_version' from source: facts 30529 1726882673.69858: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882673.69861: variable 'omit' from source: magic vars 30529 1726882673.69943: variable 'omit' from source: magic vars 30529 1726882673.69959: variable 'omit' from source: magic vars 30529 1726882673.70005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882673.70036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882673.70061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882673.70071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.70081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.70140: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882673.70143: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.70147: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882673.70210: Set connection var ansible_shell_executable to /bin/sh 30529 1726882673.70213: Set connection var ansible_pipelining to False 30529 1726882673.70215: Set connection var ansible_shell_type to sh 30529 1726882673.70223: Set connection var ansible_timeout to 10 30529 1726882673.70226: Set connection var ansible_connection to ssh 30529 1726882673.70230: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882673.70249: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.70252: variable 'ansible_connection' from source: unknown 30529 1726882673.70255: variable 'ansible_module_compression' from source: unknown 30529 1726882673.70258: variable 'ansible_shell_type' from source: unknown 30529 1726882673.70261: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.70263: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.70265: variable 'ansible_pipelining' from source: unknown 30529 1726882673.70267: variable 'ansible_timeout' from source: unknown 30529 1726882673.70272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.70373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882673.70384: variable 'omit' from source: magic vars 30529 1726882673.70389: starting attempt loop 30529 1726882673.70396: running the handler 30529 1726882673.70488: variable '__network_connections_result' from source: set_fact 30529 1726882673.70531: handler run complete 30529 1726882673.70543: attempt loop complete, returning result 30529 1726882673.70546: _execute() done 30529 1726882673.70549: dumping result to json 30529 1726882673.70551: 
done dumping result, returning 30529 1726882673.70559: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001b4d] 30529 1726882673.70563: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4d 30529 1726882673.70652: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4d 30529 1726882673.70655: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 30529 1726882673.70725: no more pending results, returning what we have 30529 1726882673.70728: results queue empty 30529 1726882673.70729: checking for any_errors_fatal 30529 1726882673.70739: done checking for any_errors_fatal 30529 1726882673.70740: checking for max_fail_percentage 30529 1726882673.70741: done checking for max_fail_percentage 30529 1726882673.70742: checking to see if all hosts have failed and the running result is not ok 30529 1726882673.70743: done checking to see if all hosts have failed 30529 1726882673.70744: getting the remaining hosts for this loop 30529 1726882673.70746: done getting the remaining hosts for this loop 30529 1726882673.70749: getting the next task for host managed_node1 30529 1726882673.70756: done getting next task for host managed_node1 30529 1726882673.70760: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882673.70764: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882673.70776: getting variables 30529 1726882673.70778: in VariableManager get_vars() 30529 1726882673.70816: Calling all_inventory to load vars for managed_node1 30529 1726882673.70819: Calling groups_inventory to load vars for managed_node1 30529 1726882673.70821: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882673.70830: Calling all_plugins_play to load vars for managed_node1 30529 1726882673.70833: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882673.70835: Calling groups_plugins_play to load vars for managed_node1 30529 1726882673.72198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882673.73814: done with get_vars() 30529 1726882673.73837: done getting variables 30529 1726882673.73909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:53 -0400 (0:00:00.059) 0:01:27.765 ****** 30529 1726882673.73951: entering _queue_task() for managed_node1/debug 30529 1726882673.74264: worker is 1 (out of 1 available) 30529 1726882673.74277: exiting _queue_task() for managed_node1/debug 30529 1726882673.74497: done queuing things up, now waiting for results queue to drain 30529 1726882673.74504: waiting for pending results... 30529 1726882673.74714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882673.74812: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4e 30529 1726882673.74816: variable 'ansible_search_path' from source: unknown 30529 1726882673.74819: variable 'ansible_search_path' from source: unknown 30529 1726882673.74848: calling self._execute() 30529 1726882673.74959: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.75027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.75031: variable 'omit' from source: magic vars 30529 1726882673.75410: variable 'ansible_distribution_major_version' from source: facts 30529 1726882673.75426: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882673.75437: variable 'omit' from source: magic vars 30529 1726882673.75511: variable 'omit' from source: magic vars 30529 1726882673.75551: variable 'omit' from source: magic vars 30529 1726882673.75598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882673.75639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882673.75663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882673.75691: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.75785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.75788: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882673.75791: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.75794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.75869: Set connection var ansible_shell_executable to /bin/sh 30529 1726882673.75879: Set connection var ansible_pipelining to False 30529 1726882673.75889: Set connection var ansible_shell_type to sh 30529 1726882673.75909: Set connection var ansible_timeout to 10 30529 1726882673.75915: Set connection var ansible_connection to ssh 30529 1726882673.75924: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882673.75948: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.75955: variable 'ansible_connection' from source: unknown 30529 1726882673.75962: variable 'ansible_module_compression' from source: unknown 30529 1726882673.75967: variable 'ansible_shell_type' from source: unknown 30529 1726882673.75973: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.75979: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.75985: variable 'ansible_pipelining' from source: unknown 30529 1726882673.76005: variable 'ansible_timeout' from source: unknown 30529 1726882673.76007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.76139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882673.76226: variable 'omit' from source: magic vars 30529 1726882673.76230: starting attempt loop 30529 1726882673.76232: running the handler 30529 1726882673.76234: variable '__network_connections_result' from source: set_fact 30529 1726882673.76319: variable '__network_connections_result' from source: set_fact 30529 1726882673.76479: handler run complete 30529 1726882673.76512: attempt loop complete, returning result 30529 1726882673.76519: _execute() done 30529 1726882673.76525: dumping result to json 30529 1726882673.76532: done dumping result, returning 30529 1726882673.76548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001b4e] 30529 1726882673.76570: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4e ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30529 1726882673.76914: no more pending results, returning what we have 30529 1726882673.76918: results queue empty 30529 1726882673.76919: checking for any_errors_fatal 30529 1726882673.76927: done checking for any_errors_fatal 30529 1726882673.76927: checking for max_fail_percentage 30529 1726882673.76929: done checking for max_fail_percentage 30529 1726882673.76930: checking to see if all hosts have failed and the running result is not ok 30529 1726882673.76931: done checking to see if all hosts have failed 30529 1726882673.76932: getting the 
remaining hosts for this loop 30529 1726882673.76934: done getting the remaining hosts for this loop 30529 1726882673.76937: getting the next task for host managed_node1 30529 1726882673.76946: done getting next task for host managed_node1 30529 1726882673.76950: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882673.76955: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882673.76973: getting variables 30529 1726882673.76975: in VariableManager get_vars() 30529 1726882673.77220: Calling all_inventory to load vars for managed_node1 30529 1726882673.77224: Calling groups_inventory to load vars for managed_node1 30529 1726882673.77226: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882673.77236: Calling all_plugins_play to load vars for managed_node1 30529 1726882673.77239: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882673.77242: Calling groups_plugins_play to load vars for managed_node1 30529 1726882673.77850: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4e 30529 1726882673.77862: WORKER PROCESS EXITING 30529 1726882673.79008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882673.80836: done with get_vars() 30529 1726882673.80874: done getting variables 30529 1726882673.80935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:53 -0400 (0:00:00.070) 0:01:27.835 ****** 30529 1726882673.80984: entering _queue_task() for managed_node1/debug 30529 1726882673.81391: worker is 1 (out of 1 available) 30529 1726882673.81406: exiting _queue_task() for managed_node1/debug 30529 1726882673.81417: done queuing things up, now waiting for results queue to drain 30529 1726882673.81418: waiting for pending results... 
30529 1726882673.81814: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882673.81878: in run() - task 12673a56-9f93-b0f1-edc0-000000001b4f 30529 1726882673.81905: variable 'ansible_search_path' from source: unknown 30529 1726882673.81916: variable 'ansible_search_path' from source: unknown 30529 1726882673.81958: calling self._execute() 30529 1726882673.82074: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.82086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.82103: variable 'omit' from source: magic vars 30529 1726882673.82536: variable 'ansible_distribution_major_version' from source: facts 30529 1726882673.82555: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882673.82690: variable 'network_state' from source: role '' defaults 30529 1726882673.82710: Evaluated conditional (network_state != {}): False 30529 1726882673.82719: when evaluation is False, skipping this task 30529 1726882673.82727: _execute() done 30529 1726882673.82734: dumping result to json 30529 1726882673.82741: done dumping result, returning 30529 1726882673.82753: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000001b4f] 30529 1726882673.82763: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4f 30529 1726882673.82963: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b4f 30529 1726882673.82966: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882673.83065: no more pending results, returning what we have 30529 1726882673.83071: results queue empty 30529 1726882673.83072: checking for any_errors_fatal 30529 1726882673.83087: done checking for any_errors_fatal 30529 1726882673.83088: checking for 
max_fail_percentage 30529 1726882673.83092: done checking for max_fail_percentage 30529 1726882673.83095: checking to see if all hosts have failed and the running result is not ok 30529 1726882673.83096: done checking to see if all hosts have failed 30529 1726882673.83096: getting the remaining hosts for this loop 30529 1726882673.83098: done getting the remaining hosts for this loop 30529 1726882673.83103: getting the next task for host managed_node1 30529 1726882673.83115: done getting next task for host managed_node1 30529 1726882673.83119: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882673.83127: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882673.83162: getting variables 30529 1726882673.83165: in VariableManager get_vars() 30529 1726882673.83326: Calling all_inventory to load vars for managed_node1 30529 1726882673.83329: Calling groups_inventory to load vars for managed_node1 30529 1726882673.83361: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882673.83404: Calling all_plugins_play to load vars for managed_node1 30529 1726882673.83408: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882673.83412: Calling groups_plugins_play to load vars for managed_node1 30529 1726882673.85666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882673.88346: done with get_vars() 30529 1726882673.88369: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:53 -0400 (0:00:00.074) 0:01:27.910 ****** 30529 1726882673.88468: entering _queue_task() for managed_node1/ping 30529 1726882673.88807: worker is 1 (out of 1 available) 30529 1726882673.88820: exiting _queue_task() for managed_node1/ping 30529 1726882673.88835: done queuing things up, now waiting for results queue to drain 30529 1726882673.88837: waiting for pending results... 
30529 1726882673.89113: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882673.89323: in run() - task 12673a56-9f93-b0f1-edc0-000000001b50 30529 1726882673.89329: variable 'ansible_search_path' from source: unknown 30529 1726882673.89332: variable 'ansible_search_path' from source: unknown 30529 1726882673.89345: calling self._execute() 30529 1726882673.89451: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.89476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.89540: variable 'omit' from source: magic vars 30529 1726882673.89943: variable 'ansible_distribution_major_version' from source: facts 30529 1726882673.89960: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882673.89973: variable 'omit' from source: magic vars 30529 1726882673.90062: variable 'omit' from source: magic vars 30529 1726882673.90127: variable 'omit' from source: magic vars 30529 1726882673.90163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882673.90236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882673.90239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882673.90259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.90276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882673.90327: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882673.90343: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.90451: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882673.90548: Set connection var ansible_shell_executable to /bin/sh 30529 1726882673.90909: Set connection var ansible_pipelining to False 30529 1726882673.90913: Set connection var ansible_shell_type to sh 30529 1726882673.90915: Set connection var ansible_timeout to 10 30529 1726882673.90917: Set connection var ansible_connection to ssh 30529 1726882673.90919: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882673.90921: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.90923: variable 'ansible_connection' from source: unknown 30529 1726882673.90925: variable 'ansible_module_compression' from source: unknown 30529 1726882673.90927: variable 'ansible_shell_type' from source: unknown 30529 1726882673.90929: variable 'ansible_shell_executable' from source: unknown 30529 1726882673.90931: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882673.90933: variable 'ansible_pipelining' from source: unknown 30529 1726882673.90935: variable 'ansible_timeout' from source: unknown 30529 1726882673.90937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882673.91243: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882673.91304: variable 'omit' from source: magic vars 30529 1726882673.91315: starting attempt loop 30529 1726882673.91322: running the handler 30529 1726882673.91345: _low_level_execute_command(): starting 30529 1726882673.91359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882673.92832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882673.92850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.92882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.93117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.93143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.93226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.94857: stdout chunk (state=3): >>>/root <<< 30529 1726882673.94982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.94988: stdout chunk (state=3): >>><<< 30529 1726882673.95003: stderr chunk (state=3): >>><<< 30529 1726882673.95026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.95038: _low_level_execute_command(): starting 30529 1726882673.95045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710 `" && echo ansible-tmp-1726882673.9502442-34676-225701142886710="` echo /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710 `" ) && sleep 0' 30529 1726882673.96057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882673.96097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.96101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.96104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.96107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882673.96109: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882673.96121: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.96153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882673.96156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882673.96158: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882673.96161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.96163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.96169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.96262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882673.96266: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882673.96268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.96271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882673.96273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.96291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.96372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882673.98222: stdout chunk (state=3): >>>ansible-tmp-1726882673.9502442-34676-225701142886710=/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710 <<< 30529 1726882673.98372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882673.98382: stdout chunk (state=3): >>><<< 30529 1726882673.98439: stderr chunk (state=3): >>><<< 30529 1726882673.98609: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882673.9502442-34676-225701142886710=/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882673.98613: variable 'ansible_module_compression' from source: unknown 30529 1726882673.98615: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882673.98618: variable 'ansible_facts' from source: unknown 30529 1726882673.98769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py 30529 1726882673.98990: Sending initial data 30529 1726882673.99004: Sent initial data (153 bytes) 30529 1726882673.99590: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882673.99602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882673.99608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.99630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882673.99634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882673.99691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882673.99707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882673.99751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.01499: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882674.01508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882674.01554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmph94zf7fn /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py <<< 30529 1726882674.01558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmph94zf7fn" to remote "/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py" <<< 30529 1726882674.02942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882674.02945: stdout chunk (state=3): >>><<< 30529 1726882674.02947: stderr chunk (state=3): >>><<< 30529 1726882674.02956: done transferring module to remote 30529 1726882674.02973: _low_level_execute_command(): starting 30529 1726882674.02989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/ /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py && sleep 0' 30529 1726882674.03582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882674.03702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882674.03722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.03798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.05616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882674.05630: stderr chunk (state=3): >>><<< 30529 1726882674.05698: stdout chunk (state=3): >>><<< 30529 1726882674.05895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882674.05899: _low_level_execute_command(): starting 30529 1726882674.05902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/AnsiballZ_ping.py && sleep 0' 30529 1726882674.07461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882674.07581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.07823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882674.08103: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.08123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.22796: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882674.24106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882674.24110: stdout chunk (state=3): >>><<< 30529 1726882674.24113: stderr chunk (state=3): >>><<< 30529 1726882674.24115: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882674.24120: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882674.24124: _low_level_execute_command(): starting 30529 1726882674.24135: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882673.9502442-34676-225701142886710/ > /dev/null 2>&1 && sleep 0' 30529 1726882674.24914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882674.24917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.24922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882674.24926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.25029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882674.25036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.25101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.26903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882674.26914: stderr chunk (state=3): >>><<< 30529 1726882674.26917: stdout chunk (state=3): >>><<< 30529 1726882674.26934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882674.26941: handler run complete 30529 
1726882674.26962: attempt loop complete, returning result 30529 1726882674.26965: _execute() done 30529 1726882674.26969: dumping result to json 30529 1726882674.26971: done dumping result, returning 30529 1726882674.26974: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000001b50] 30529 1726882674.27010: sending task result for task 12673a56-9f93-b0f1-edc0-000000001b50 30529 1726882674.27087: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001b50 30529 1726882674.27089: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882674.27159: no more pending results, returning what we have 30529 1726882674.27163: results queue empty 30529 1726882674.27164: checking for any_errors_fatal 30529 1726882674.27175: done checking for any_errors_fatal 30529 1726882674.27176: checking for max_fail_percentage 30529 1726882674.27178: done checking for max_fail_percentage 30529 1726882674.27179: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.27182: done checking to see if all hosts have failed 30529 1726882674.27182: getting the remaining hosts for this loop 30529 1726882674.27184: done getting the remaining hosts for this loop 30529 1726882674.27187: getting the next task for host managed_node1 30529 1726882674.27200: done getting next task for host managed_node1 30529 1726882674.27204: ^ task is: TASK: meta (role_complete) 30529 1726882674.27210: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882674.27226: getting variables 30529 1726882674.27229: in VariableManager get_vars() 30529 1726882674.27281: Calling all_inventory to load vars for managed_node1 30529 1726882674.27284: Calling groups_inventory to load vars for managed_node1 30529 1726882674.27286: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.27327: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.27331: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.27335: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.28248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.29124: done with get_vars() 30529 1726882674.29142: done getting variables 30529 1726882674.29201: done queuing things up, now waiting for results queue to drain 30529 1726882674.29202: results queue empty 30529 1726882674.29203: checking for any_errors_fatal 30529 1726882674.29205: done checking for any_errors_fatal 30529 1726882674.29205: checking for max_fail_percentage 30529 1726882674.29206: done checking for max_fail_percentage 30529 1726882674.29206: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882674.29206: done checking to see if all hosts have failed 30529 1726882674.29207: getting the remaining hosts for this loop 30529 1726882674.29207: done getting the remaining hosts for this loop 30529 1726882674.29209: getting the next task for host managed_node1 30529 1726882674.29212: done getting next task for host managed_node1 30529 1726882674.29214: ^ task is: TASK: Test 30529 1726882674.29216: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882674.29218: getting variables 30529 1726882674.29218: in VariableManager get_vars() 30529 1726882674.29225: Calling all_inventory to load vars for managed_node1 30529 1726882674.29227: Calling groups_inventory to load vars for managed_node1 30529 1726882674.29228: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.29231: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.29233: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.29234: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.29980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.31600: done with get_vars() 30529 1726882674.31622: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:37:54 -0400 (0:00:00.432) 0:01:28.342 ****** 30529 1726882674.31676: entering _queue_task() for managed_node1/include_tasks 30529 1726882674.32042: worker is 1 (out of 1 available) 30529 1726882674.32056: exiting _queue_task() for managed_node1/include_tasks 30529 1726882674.32070: done queuing things up, now waiting for results queue to drain 30529 1726882674.32071: waiting for pending results... 
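The TASK [Test] banner above queues an `include_tasks` driven by the `lsr_test` variable (both `lsr_test` and the per-`item` variables appear in the following executor output). A hedged sketch of what that task at run_test.yml:30 plausibly looks like; the exact parameter names are assumptions inferred from the log, not the file's verbatim source:

```yaml
# Hypothetical reconstruction of the "Test" task at run_test.yml:30.
# The loop variable and conditional are inferred from the log's
# "variable 'lsr_test' from source: include params" and
# "Evaluated conditional (ansible_distribution_major_version != '6')".
- name: Test
  include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"
  when: ansible_distribution_major_version != '6'
```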
30529 1726882674.32347: running TaskExecutor() for managed_node1/TASK: Test 30529 1726882674.32432: in run() - task 12673a56-9f93-b0f1-edc0-000000001748 30529 1726882674.32445: variable 'ansible_search_path' from source: unknown 30529 1726882674.32449: variable 'ansible_search_path' from source: unknown 30529 1726882674.32514: variable 'lsr_test' from source: include params 30529 1726882674.32708: variable 'lsr_test' from source: include params 30529 1726882674.32763: variable 'omit' from source: magic vars 30529 1726882674.32885: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.32896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.32907: variable 'omit' from source: magic vars 30529 1726882674.33077: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.33085: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.33095: variable 'item' from source: unknown 30529 1726882674.33142: variable 'item' from source: unknown 30529 1726882674.33164: variable 'item' from source: unknown 30529 1726882674.33212: variable 'item' from source: unknown 30529 1726882674.33341: dumping result to json 30529 1726882674.33344: done dumping result, returning 30529 1726882674.33346: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-000000001748] 30529 1726882674.33347: sending task result for task 12673a56-9f93-b0f1-edc0-000000001748 30529 1726882674.33381: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001748 30529 1726882674.33383: WORKER PROCESS EXITING 30529 1726882674.33404: no more pending results, returning what we have 30529 1726882674.33409: in VariableManager get_vars() 30529 1726882674.33451: Calling all_inventory to load vars for managed_node1 30529 1726882674.33453: Calling groups_inventory to load vars for managed_node1 30529 1726882674.33456: Calling all_plugins_inventory to load 
vars for managed_node1 30529 1726882674.33468: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.33471: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.33473: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.34429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.35339: done with get_vars() 30529 1726882674.35352: variable 'ansible_search_path' from source: unknown 30529 1726882674.35353: variable 'ansible_search_path' from source: unknown 30529 1726882674.35380: we have included files to process 30529 1726882674.35381: generating all_blocks data 30529 1726882674.35383: done generating all_blocks data 30529 1726882674.35390: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882674.35392: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882674.35395: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882674.35538: done processing included file 30529 1726882674.35540: iterating over new_blocks loaded from include file 30529 1726882674.35541: in VariableManager get_vars() 30529 1726882674.35562: done with get_vars() 30529 1726882674.35565: filtering new block on tags 30529 1726882674.35596: done filtering new block on tags 30529 1726882674.35599: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node1 => (item=tasks/remove+down_profile.yml) 30529 1726882674.35604: extending task lists for all hosts with included blocks 30529 1726882674.36292: done extending task 
lists 30529 1726882674.36295: done processing included files 30529 1726882674.36295: results queue empty 30529 1726882674.36296: checking for any_errors_fatal 30529 1726882674.36297: done checking for any_errors_fatal 30529 1726882674.36297: checking for max_fail_percentage 30529 1726882674.36298: done checking for max_fail_percentage 30529 1726882674.36298: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.36299: done checking to see if all hosts have failed 30529 1726882674.36299: getting the remaining hosts for this loop 30529 1726882674.36300: done getting the remaining hosts for this loop 30529 1726882674.36302: getting the next task for host managed_node1 30529 1726882674.36305: done getting next task for host managed_node1 30529 1726882674.36306: ^ task is: TASK: Include network role 30529 1726882674.36308: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882674.36310: getting variables 30529 1726882674.36310: in VariableManager get_vars() 30529 1726882674.36318: Calling all_inventory to load vars for managed_node1 30529 1726882674.36319: Calling groups_inventory to load vars for managed_node1 30529 1726882674.36321: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.36325: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.36326: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.36328: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.37308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.38410: done with get_vars() 30529 1726882674.38425: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:37:54 -0400 (0:00:00.068) 0:01:28.410 ****** 30529 1726882674.38485: entering _queue_task() for managed_node1/include_role 30529 1726882674.38747: worker is 1 (out of 1 available) 30529 1726882674.38760: exiting _queue_task() for managed_node1/include_role 30529 1726882674.38773: done queuing things up, now waiting for results queue to drain 30529 1726882674.38775: waiting for pending results... 
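The TASK [Include network role] banner above corresponds to an `include_role` of `fedora.linux_system_roles.network` (the log goes on to load that role's defaults/main.yml, meta/main.yml, and tasks/main.yml). A minimal sketch of the task at remove+down_profile.yml:3, assuming the standard include form; not the file's verbatim contents:

```yaml
# Hedged sketch of the "Include network role" task at
# remove+down_profile.yml:3. The role name is taken from the log;
# the conditional matches the evaluated conditional shown there.
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
  when: ansible_distribution_major_version != '6'
```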
30529 1726882674.39018: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882674.39202: in run() - task 12673a56-9f93-b0f1-edc0-000000001ca9 30529 1726882674.39208: variable 'ansible_search_path' from source: unknown 30529 1726882674.39215: variable 'ansible_search_path' from source: unknown 30529 1726882674.39219: calling self._execute() 30529 1726882674.39333: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.39353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.39380: variable 'omit' from source: magic vars 30529 1726882674.39800: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.39832: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.39852: _execute() done 30529 1726882674.39867: dumping result to json 30529 1726882674.39880: done dumping result, returning 30529 1726882674.39905: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000001ca9] 30529 1726882674.39931: sending task result for task 12673a56-9f93-b0f1-edc0-000000001ca9 30529 1726882674.40028: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001ca9 30529 1726882674.40030: WORKER PROCESS EXITING 30529 1726882674.40087: no more pending results, returning what we have 30529 1726882674.40091: in VariableManager get_vars() 30529 1726882674.40134: Calling all_inventory to load vars for managed_node1 30529 1726882674.40137: Calling groups_inventory to load vars for managed_node1 30529 1726882674.40141: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.40151: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.40154: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.40156: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.41669: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.42569: done with get_vars() 30529 1726882674.42583: variable 'ansible_search_path' from source: unknown 30529 1726882674.42584: variable 'ansible_search_path' from source: unknown 30529 1726882674.42670: variable 'omit' from source: magic vars 30529 1726882674.42700: variable 'omit' from source: magic vars 30529 1726882674.42709: variable 'omit' from source: magic vars 30529 1726882674.42712: we have included files to process 30529 1726882674.42712: generating all_blocks data 30529 1726882674.42714: done generating all_blocks data 30529 1726882674.42714: processing included file: fedora.linux_system_roles.network 30529 1726882674.42727: in VariableManager get_vars() 30529 1726882674.42737: done with get_vars() 30529 1726882674.42756: in VariableManager get_vars() 30529 1726882674.42769: done with get_vars() 30529 1726882674.42800: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882674.42873: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882674.42925: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882674.43271: in VariableManager get_vars() 30529 1726882674.43284: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882674.45038: iterating over new_blocks loaded from include file 30529 1726882674.45040: in VariableManager get_vars() 30529 1726882674.45058: done with get_vars() 30529 1726882674.45059: filtering new block on tags 30529 1726882674.45504: done filtering new block on tags 30529 1726882674.45508: in VariableManager get_vars() 30529 1726882674.45526: done with get_vars() 30529 1726882674.45527: filtering new block on tags 30529 1726882674.45544: done 
filtering new block on tags 30529 1726882674.45546: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882674.45552: extending task lists for all hosts with included blocks 30529 1726882674.45688: done extending task lists 30529 1726882674.45689: done processing included files 30529 1726882674.45690: results queue empty 30529 1726882674.45691: checking for any_errors_fatal 30529 1726882674.45697: done checking for any_errors_fatal 30529 1726882674.45698: checking for max_fail_percentage 30529 1726882674.45699: done checking for max_fail_percentage 30529 1726882674.45700: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.45701: done checking to see if all hosts have failed 30529 1726882674.45701: getting the remaining hosts for this loop 30529 1726882674.45703: done getting the remaining hosts for this loop 30529 1726882674.45705: getting the next task for host managed_node1 30529 1726882674.45717: done getting next task for host managed_node1 30529 1726882674.45720: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882674.45723: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882674.45734: getting variables 30529 1726882674.45735: in VariableManager get_vars() 30529 1726882674.45749: Calling all_inventory to load vars for managed_node1 30529 1726882674.45751: Calling groups_inventory to load vars for managed_node1 30529 1726882674.45753: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.45759: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.45761: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.45764: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.47249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.49186: done with get_vars() 30529 1726882674.49220: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:54 -0400 (0:00:00.108) 0:01:28.519 ****** 30529 1726882674.49328: entering _queue_task() for managed_node1/include_tasks 30529 1726882674.49910: worker is 1 (out of 1 available) 30529 1726882674.49923: exiting _queue_task() for managed_node1/include_tasks 30529 1726882674.49935: done queuing things up, now waiting for results queue to drain 30529 1726882674.49937: waiting for pending results... 
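The "Ensure ansible_facts used by role" task queued above is an `include_tasks` that pulls in the role's set_facts.yml (the log shows it being loaded from roles/network/tasks/). A minimal sketch of the task at roles/network/tasks/main.yml:4, under the assumption that it is a plain include; not the role's verbatim source:

```yaml
# Simplified illustration of the fact-gathering include at
# roles/network/tasks/main.yml:4, inferred from the log's
# "processing included file: .../roles/network/tasks/set_facts.yml".
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```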
30529 1726882674.50313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882674.50374: in run() - task 12673a56-9f93-b0f1-edc0-000000001d2b 30529 1726882674.50388: variable 'ansible_search_path' from source: unknown 30529 1726882674.50392: variable 'ansible_search_path' from source: unknown 30529 1726882674.50543: calling self._execute() 30529 1726882674.50547: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.50556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.50565: variable 'omit' from source: magic vars 30529 1726882674.50966: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.50986: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.50990: _execute() done 30529 1726882674.50997: dumping result to json 30529 1726882674.50999: done dumping result, returning 30529 1726882674.51010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000001d2b] 30529 1726882674.51014: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2b 30529 1726882674.51218: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2b 30529 1726882674.51222: WORKER PROCESS EXITING 30529 1726882674.51272: no more pending results, returning what we have 30529 1726882674.51277: in VariableManager get_vars() 30529 1726882674.51333: Calling all_inventory to load vars for managed_node1 30529 1726882674.51336: Calling groups_inventory to load vars for managed_node1 30529 1726882674.51339: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.51351: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.51355: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.51358: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882674.52965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.54517: done with get_vars() 30529 1726882674.54536: variable 'ansible_search_path' from source: unknown 30529 1726882674.54537: variable 'ansible_search_path' from source: unknown 30529 1726882674.54575: we have included files to process 30529 1726882674.54576: generating all_blocks data 30529 1726882674.54578: done generating all_blocks data 30529 1726882674.54580: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882674.54581: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882674.54584: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882674.55143: done processing included file 30529 1726882674.55145: iterating over new_blocks loaded from include file 30529 1726882674.55147: in VariableManager get_vars() 30529 1726882674.55174: done with get_vars() 30529 1726882674.55175: filtering new block on tags 30529 1726882674.55210: done filtering new block on tags 30529 1726882674.55213: in VariableManager get_vars() 30529 1726882674.55234: done with get_vars() 30529 1726882674.55236: filtering new block on tags 30529 1726882674.55277: done filtering new block on tags 30529 1726882674.55280: in VariableManager get_vars() 30529 1726882674.55307: done with get_vars() 30529 1726882674.55309: filtering new block on tags 30529 1726882674.55349: done filtering new block on tags 30529 1726882674.55351: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882674.55356: extending task lists for 
all hosts with included blocks 30529 1726882674.57045: done extending task lists 30529 1726882674.57047: done processing included files 30529 1726882674.57048: results queue empty 30529 1726882674.57048: checking for any_errors_fatal 30529 1726882674.57051: done checking for any_errors_fatal 30529 1726882674.57052: checking for max_fail_percentage 30529 1726882674.57053: done checking for max_fail_percentage 30529 1726882674.57054: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.57055: done checking to see if all hosts have failed 30529 1726882674.57055: getting the remaining hosts for this loop 30529 1726882674.57057: done getting the remaining hosts for this loop 30529 1726882674.57059: getting the next task for host managed_node1 30529 1726882674.57065: done getting next task for host managed_node1 30529 1726882674.57067: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882674.57071: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882674.57082: getting variables 30529 1726882674.57083: in VariableManager get_vars() 30529 1726882674.57102: Calling all_inventory to load vars for managed_node1 30529 1726882674.57105: Calling groups_inventory to load vars for managed_node1 30529 1726882674.57107: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.57112: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.57115: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.57118: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.58261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.60071: done with get_vars() 30529 1726882674.60099: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:54 -0400 (0:00:00.108) 0:01:28.627 ****** 30529 1726882674.60181: entering _queue_task() for managed_node1/setup 30529 1726882674.60582: worker is 1 (out of 1 available) 30529 1726882674.60598: exiting _queue_task() for managed_node1/setup 30529 1726882674.60611: done queuing things up, now waiting for results queue to drain 30529 1726882674.60613: waiting for pending results... 
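The "Ensure ansible_facts used by role are present" task entering `_queue_task()` above is a `setup` (fact-gathering) action at set_facts.yml:3; the log subsequently resolves a `__network_required_facts` variable from the role. A hedged sketch of how such a conditional gather is commonly written; the `gather_subset` value and the conditional expression are assumptions, not confirmed by the log:

```yaml
# Hypothetical sketch of the conditional setup task at set_facts.yml:3.
# '__network_required_facts' is the role variable visible in the log;
# the when-expression (re-gather only if required facts are missing)
# is an illustrative assumption.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```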
30529 1726882674.61388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882674.61799: in run() - task 12673a56-9f93-b0f1-edc0-000000001d82 30529 1726882674.61803: variable 'ansible_search_path' from source: unknown 30529 1726882674.61806: variable 'ansible_search_path' from source: unknown 30529 1726882674.61809: calling self._execute() 30529 1726882674.61927: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.61931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.61946: variable 'omit' from source: magic vars 30529 1726882674.62738: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.62750: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.63207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882674.66803: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882674.66870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882674.66912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882674.66949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882674.66976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882674.67229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882674.67233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882674.67235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882674.67237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882674.67240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882674.67258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882674.67288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882674.67351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882674.67398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882674.67413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882674.67621: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882674.67627: variable 'ansible_facts' from source: unknown 30529 1726882674.68456: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882674.68466: when evaluation is False, skipping this task 30529 1726882674.68469: _execute() done 30529 1726882674.68471: dumping result to json 30529 1726882674.68476: done dumping result, returning 30529 1726882674.68489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000001d82] 30529 1726882674.68491: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d82 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882674.68640: no more pending results, returning what we have 30529 1726882674.68645: results queue empty 30529 1726882674.68646: checking for any_errors_fatal 30529 1726882674.68649: done checking for any_errors_fatal 30529 1726882674.68649: checking for max_fail_percentage 30529 1726882674.68651: done checking for max_fail_percentage 30529 1726882674.68652: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.68653: done checking to see if all hosts have failed 30529 1726882674.68654: getting the remaining hosts for this loop 30529 1726882674.68656: done getting the remaining hosts for this loop 30529 1726882674.68660: getting the next task for host managed_node1 30529 1726882674.68674: done getting next task for host managed_node1 30529 1726882674.68678: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882674.68684: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882674.68718: getting variables 30529 1726882674.68720: in VariableManager get_vars() 30529 1726882674.68768: Calling all_inventory to load vars for managed_node1 30529 1726882674.68772: Calling groups_inventory to load vars for managed_node1 30529 1726882674.68774: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.68791: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.68797: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.68801: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.69329: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d82 30529 1726882674.69339: WORKER PROCESS EXITING 30529 1726882674.70669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.73747: done with get_vars() 30529 1726882674.73782: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:54 -0400 (0:00:00.137) 0:01:28.765 ****** 30529 1726882674.73940: entering _queue_task() for managed_node1/stat 30529 1726882674.74326: worker is 1 (out of 1 available) 30529 1726882674.74340: exiting _queue_task() for managed_node1/stat 30529 1726882674.74353: done queuing things up, now waiting for results queue to drain 30529 1726882674.74355: waiting for pending results... 
30529 1726882674.74670: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882674.74826: in run() - task 12673a56-9f93-b0f1-edc0-000000001d84 30529 1726882674.74999: variable 'ansible_search_path' from source: unknown 30529 1726882674.75005: variable 'ansible_search_path' from source: unknown 30529 1726882674.75008: calling self._execute() 30529 1726882674.75011: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.75013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.75016: variable 'omit' from source: magic vars 30529 1726882674.75398: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.75402: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.75598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882674.75832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882674.75873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882674.76201: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882674.76204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882674.76207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882674.76209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882674.76212: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882674.76214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882674.76217: variable '__network_is_ostree' from source: set_fact 30529 1726882674.76219: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882674.76222: when evaluation is False, skipping this task 30529 1726882674.76224: _execute() done 30529 1726882674.76228: dumping result to json 30529 1726882674.76231: done dumping result, returning 30529 1726882674.76308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000001d84] 30529 1726882674.76311: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d84 30529 1726882674.76378: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d84 30529 1726882674.76381: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882674.76635: no more pending results, returning what we have 30529 1726882674.76639: results queue empty 30529 1726882674.76640: checking for any_errors_fatal 30529 1726882674.76646: done checking for any_errors_fatal 30529 1726882674.76646: checking for max_fail_percentage 30529 1726882674.76648: done checking for max_fail_percentage 30529 1726882674.76649: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.76650: done checking to see if all hosts have failed 30529 1726882674.76650: getting the remaining hosts for this loop 30529 1726882674.76652: done getting the remaining hosts for this loop 30529 
1726882674.76655: getting the next task for host managed_node1 30529 1726882674.76662: done getting next task for host managed_node1 30529 1726882674.76666: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882674.76671: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882674.76694: getting variables 30529 1726882674.76696: in VariableManager get_vars() 30529 1726882674.76733: Calling all_inventory to load vars for managed_node1 30529 1726882674.76735: Calling groups_inventory to load vars for managed_node1 30529 1726882674.76738: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.76746: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.76749: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.76752: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.78310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.80039: done with get_vars() 30529 1726882674.80063: done getting variables 30529 1726882674.80124: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:54 -0400 (0:00:00.062) 0:01:28.827 ****** 30529 1726882674.80163: entering _queue_task() for managed_node1/set_fact 30529 1726882674.80964: worker is 1 (out of 1 available) 30529 1726882674.80977: exiting _queue_task() for managed_node1/set_fact 30529 1726882674.81191: done queuing things up, now waiting for results queue to drain 30529 1726882674.81197: waiting for pending results... 
30529 1726882674.81738: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882674.82184: in run() - task 12673a56-9f93-b0f1-edc0-000000001d85 30529 1726882674.82189: variable 'ansible_search_path' from source: unknown 30529 1726882674.82192: variable 'ansible_search_path' from source: unknown 30529 1726882674.82228: calling self._execute() 30529 1726882674.82499: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.82516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.82531: variable 'omit' from source: magic vars 30529 1726882674.83116: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.83128: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.83303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882674.83577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882674.83630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882674.83662: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882674.83700: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882674.83782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882674.83815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882674.83839: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882674.83863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882674.83956: variable '__network_is_ostree' from source: set_fact 30529 1726882674.83962: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882674.83965: when evaluation is False, skipping this task 30529 1726882674.83968: _execute() done 30529 1726882674.83970: dumping result to json 30529 1726882674.83975: done dumping result, returning 30529 1726882674.83983: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000001d85] 30529 1726882674.83986: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d85 30529 1726882674.84085: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d85 30529 1726882674.84089: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882674.84139: no more pending results, returning what we have 30529 1726882674.84143: results queue empty 30529 1726882674.84144: checking for any_errors_fatal 30529 1726882674.84154: done checking for any_errors_fatal 30529 1726882674.84154: checking for max_fail_percentage 30529 1726882674.84157: done checking for max_fail_percentage 30529 1726882674.84158: checking to see if all hosts have failed and the running result is not ok 30529 1726882674.84159: done checking to see if all hosts have failed 30529 1726882674.84159: getting the remaining hosts for this loop 30529 1726882674.84161: done getting the remaining hosts for this loop 
30529 1726882674.84165: getting the next task for host managed_node1 30529 1726882674.84178: done getting next task for host managed_node1 30529 1726882674.84181: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882674.84190: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882674.84217: getting variables 30529 1726882674.84219: in VariableManager get_vars() 30529 1726882674.84263: Calling all_inventory to load vars for managed_node1 30529 1726882674.84266: Calling groups_inventory to load vars for managed_node1 30529 1726882674.84268: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882674.84281: Calling all_plugins_play to load vars for managed_node1 30529 1726882674.84284: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882674.84289: Calling groups_plugins_play to load vars for managed_node1 30529 1726882674.86324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882674.87940: done with get_vars() 30529 1726882674.87964: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:54 -0400 (0:00:00.079) 0:01:28.906 ****** 30529 1726882674.88069: entering _queue_task() for managed_node1/service_facts 30529 1726882674.88763: worker is 1 (out of 1 available) 30529 1726882674.88775: exiting _queue_task() for managed_node1/service_facts 30529 1726882674.88789: done queuing things up, now waiting for results queue to drain 30529 1726882674.88791: waiting for pending results... 
30529 1726882674.89399: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882674.89601: in run() - task 12673a56-9f93-b0f1-edc0-000000001d87 30529 1726882674.89617: variable 'ansible_search_path' from source: unknown 30529 1726882674.89626: variable 'ansible_search_path' from source: unknown 30529 1726882674.89668: calling self._execute() 30529 1726882674.89844: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.89848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.89859: variable 'omit' from source: magic vars 30529 1726882674.90254: variable 'ansible_distribution_major_version' from source: facts 30529 1726882674.90267: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882674.90273: variable 'omit' from source: magic vars 30529 1726882674.90360: variable 'omit' from source: magic vars 30529 1726882674.90398: variable 'omit' from source: magic vars 30529 1726882674.90439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882674.90473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882674.90496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882674.90514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882674.90531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882674.90563: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882674.90566: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.90569: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882674.90675: Set connection var ansible_shell_executable to /bin/sh 30529 1726882674.90682: Set connection var ansible_pipelining to False 30529 1726882674.90685: Set connection var ansible_shell_type to sh 30529 1726882674.90698: Set connection var ansible_timeout to 10 30529 1726882674.90700: Set connection var ansible_connection to ssh 30529 1726882674.90800: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882674.90804: variable 'ansible_shell_executable' from source: unknown 30529 1726882674.90806: variable 'ansible_connection' from source: unknown 30529 1726882674.90809: variable 'ansible_module_compression' from source: unknown 30529 1726882674.90811: variable 'ansible_shell_type' from source: unknown 30529 1726882674.90812: variable 'ansible_shell_executable' from source: unknown 30529 1726882674.90815: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882674.90817: variable 'ansible_pipelining' from source: unknown 30529 1726882674.90819: variable 'ansible_timeout' from source: unknown 30529 1726882674.90821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882674.90955: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882674.90971: variable 'omit' from source: magic vars 30529 1726882674.90975: starting attempt loop 30529 1726882674.90978: running the handler 30529 1726882674.91001: _low_level_execute_command(): starting 30529 1726882674.91009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882674.92141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.92146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882674.92148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882674.92423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.92476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.94283: stdout chunk (state=3): >>>/root <<< 30529 1726882674.94287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882674.94290: stderr chunk (state=3): >>><<< 30529 1726882674.94299: stdout chunk (state=3): >>><<< 30529 1726882674.94329: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882674.94342: _low_level_execute_command(): starting 30529 1726882674.94349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452 `" && echo ansible-tmp-1726882674.9432857-34726-251673203211452="` echo /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452 `" ) && sleep 0' 30529 1726882674.95344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882674.95350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882674.95452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882674.95456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882674.95458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882674.95460: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882674.95469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.95472: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882674.95474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882674.95478: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.95518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882674.95528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882674.95541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.95618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882674.97657: stdout chunk (state=3): >>>ansible-tmp-1726882674.9432857-34726-251673203211452=/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452 <<< 30529 1726882674.97660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882674.97663: stdout chunk (state=3): >>><<< 30529 1726882674.97665: stderr chunk (state=3): >>><<< 30529 1726882674.97716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882674.9432857-34726-251673203211452=/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882674.97800: variable 'ansible_module_compression' from source: unknown 30529 1726882674.97984: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882674.97986: variable 'ansible_facts' from source: unknown 30529 1726882674.98066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py 30529 1726882674.98289: Sending initial data 30529 1726882674.98292: Sent initial data (162 bytes) 30529 1726882674.98904: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882674.98920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882674.98966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882674.99075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.99106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882674.99192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882674.99216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882674.99313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882675.00827: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882675.00839: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30529 1726882675.00850: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30529 1726882675.00866: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882675.00927: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882675.00994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpo6kr41wz /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py <<< 30529 1726882675.01011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py" <<< 30529 1726882675.01042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpo6kr41wz" to remote "/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py" <<< 30529 1726882675.01929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882675.01933: stdout chunk (state=3): >>><<< 30529 1726882675.01935: stderr chunk (state=3): >>><<< 30529 1726882675.01945: done transferring module to remote 30529 1726882675.01962: _low_level_execute_command(): starting 30529 1726882675.01972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/ /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py && sleep 0' 30529 1726882675.02737: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882675.02755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882675.02772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882675.02815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882675.02923: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882675.02946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882675.02971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882675.03005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882675.03089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882675.04844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882675.04848: stdout chunk (state=3): >>><<< 30529 1726882675.04854: stderr chunk (state=3): >>><<< 30529 1726882675.04870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882675.04873: _low_level_execute_command(): starting 30529 1726882675.04879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/AnsiballZ_service_facts.py && sleep 0' 30529 1726882675.05699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882675.05703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882675.05705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882675.05708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882675.05710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882675.05712: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882675.05714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882675.05716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882675.05718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882675.05720: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882675.05722: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882675.05725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882675.05727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882675.05729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882675.05731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882675.05789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.57254: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": 
"modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": 
"wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", 
"state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": 
"systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882676.58901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882676.58906: stdout chunk (state=3): >>><<< 30529 1726882676.58908: stderr chunk (state=3): >>><<< 30529 1726882676.58912: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882676.60858: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882676.60878: _low_level_execute_command(): starting 30529 1726882676.60892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882674.9432857-34726-251673203211452/ > /dev/null 2>&1 && sleep 0' 30529 1726882676.61818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 30529 1726882676.61910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882676.61932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882676.62010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.63964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882676.63975: stdout chunk (state=3): >>><<< 30529 1726882676.63991: stderr chunk (state=3): >>><<< 30529 1726882676.64012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882676.64200: handler run complete 30529 1726882676.64500: variable 'ansible_facts' from source: unknown 30529 1726882676.64752: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882676.65800: variable 'ansible_facts' from source: unknown 30529 1726882676.65937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882676.66181: attempt loop complete, returning result 30529 1726882676.66198: _execute() done 30529 1726882676.66206: dumping result to json 30529 1726882676.66270: done dumping result, returning 30529 1726882676.66284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000001d87] 30529 1726882676.66300: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d87 30529 1726882676.67767: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d87 30529 1726882676.67770: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882676.67885: no more pending results, returning what we have 30529 1726882676.67890: results queue empty 30529 1726882676.67891: checking for any_errors_fatal 30529 1726882676.67895: done checking for any_errors_fatal 30529 1726882676.67896: checking for max_fail_percentage 30529 1726882676.67898: done checking for max_fail_percentage 30529 1726882676.67898: checking to see if all hosts have failed and the running result is not ok 30529 1726882676.67899: done checking to see if all hosts have failed 30529 1726882676.67900: getting the remaining hosts for this loop 30529 1726882676.67901: done getting the remaining hosts for this loop 30529 1726882676.67904: getting the next task for host managed_node1 30529 1726882676.67910: done getting next task for host managed_node1 30529 1726882676.67913: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 
1726882676.67919: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882676.67929: getting variables 30529 1726882676.67930: in VariableManager get_vars() 30529 1726882676.67958: Calling all_inventory to load vars for managed_node1 30529 1726882676.67960: Calling groups_inventory to load vars for managed_node1 30529 1726882676.67962: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882676.67973: Calling all_plugins_play to load vars for managed_node1 30529 1726882676.67976: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882676.67978: Calling groups_plugins_play to load vars for managed_node1 30529 1726882676.69800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882676.71350: done with get_vars() 30529 1726882676.71378: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:56 -0400 (0:00:01.834) 0:01:30.740 ****** 30529 1726882676.71474: entering _queue_task() for managed_node1/package_facts 30529 1726882676.72012: worker is 1 (out of 1 available) 30529 1726882676.72022: exiting _queue_task() for managed_node1/package_facts 30529 1726882676.72034: done queuing things up, now waiting for results queue to drain 30529 1726882676.72035: waiting for pending results... 
30529 1726882676.72163: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882676.72370: in run() - task 12673a56-9f93-b0f1-edc0-000000001d88 30529 1726882676.72375: variable 'ansible_search_path' from source: unknown 30529 1726882676.72378: variable 'ansible_search_path' from source: unknown 30529 1726882676.72391: calling self._execute() 30529 1726882676.72495: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882676.72508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882676.72523: variable 'omit' from source: magic vars 30529 1726882676.72897: variable 'ansible_distribution_major_version' from source: facts 30529 1726882676.72920: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882676.73016: variable 'omit' from source: magic vars 30529 1726882676.73019: variable 'omit' from source: magic vars 30529 1726882676.73058: variable 'omit' from source: magic vars 30529 1726882676.73103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882676.73147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882676.73172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882676.73196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882676.73213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882676.73253: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882676.73262: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882676.73270: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882676.73379: Set connection var ansible_shell_executable to /bin/sh 30529 1726882676.73389: Set connection var ansible_pipelining to False 30529 1726882676.73400: Set connection var ansible_shell_type to sh 30529 1726882676.73415: Set connection var ansible_timeout to 10 30529 1726882676.73422: Set connection var ansible_connection to ssh 30529 1726882676.73430: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882676.73498: variable 'ansible_shell_executable' from source: unknown 30529 1726882676.73501: variable 'ansible_connection' from source: unknown 30529 1726882676.73504: variable 'ansible_module_compression' from source: unknown 30529 1726882676.73506: variable 'ansible_shell_type' from source: unknown 30529 1726882676.73508: variable 'ansible_shell_executable' from source: unknown 30529 1726882676.73510: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882676.73512: variable 'ansible_pipelining' from source: unknown 30529 1726882676.73514: variable 'ansible_timeout' from source: unknown 30529 1726882676.73516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882676.73703: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882676.73718: variable 'omit' from source: magic vars 30529 1726882676.73727: starting attempt loop 30529 1726882676.73773: running the handler 30529 1726882676.73776: _low_level_execute_command(): starting 30529 1726882676.73778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882676.74519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882676.74566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882676.74606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882676.74625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882676.74699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.76343: stdout chunk (state=3): >>>/root <<< 30529 1726882676.76398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882676.76411: stdout chunk (state=3): >>><<< 30529 1726882676.76531: stderr chunk (state=3): >>><<< 30529 1726882676.76535: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882676.76538: _low_level_execute_command(): starting 30529 1726882676.76550: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979 `" && echo ansible-tmp-1726882676.7651176-34815-270360643217979="` echo /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979 `" ) && sleep 0' 30529 1726882676.77678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882676.77709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882676.77759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882676.78010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882676.78072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.79916: stdout chunk (state=3): >>>ansible-tmp-1726882676.7651176-34815-270360643217979=/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979 <<< 30529 1726882676.80069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882676.80072: stdout chunk (state=3): >>><<< 30529 1726882676.80075: stderr chunk (state=3): >>><<< 30529 1726882676.80159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882676.7651176-34815-270360643217979=/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882676.80162: variable 'ansible_module_compression' from source: unknown 30529 1726882676.80188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882676.80247: variable 'ansible_facts' from source: unknown 30529 1726882676.80456: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py 30529 1726882676.80609: Sending initial data 30529 1726882676.80618: Sent initial data (162 bytes) 30529 1726882676.81210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882676.81257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882676.81274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882676.81287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882676.81306: stderr chunk (state=3): >>>debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882676.81382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882676.81406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882676.81423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882676.81489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.82988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882676.83032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882676.83069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwq7ugocz /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py <<< 30529 1726882676.83076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py" <<< 30529 1726882676.83108: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwq7ugocz" to remote "/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py" <<< 30529 1726882676.83116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py" <<< 30529 1726882676.84128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882676.84168: stderr chunk (state=3): >>><<< 30529 1726882676.84171: stdout chunk (state=3): >>><<< 30529 1726882676.84195: done transferring module to remote 30529 1726882676.84207: _low_level_execute_command(): starting 30529 1726882676.84210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/ /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py && sleep 0' 30529 1726882676.84876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882676.84879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882676.84881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882676.84886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882676.84917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882676.86633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882676.86663: stderr chunk (state=3): >>><<< 30529 1726882676.86670: stdout chunk (state=3): >>><<< 30529 1726882676.86694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882676.86699: _low_level_execute_command(): starting 30529 1726882676.86702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/AnsiballZ_package_facts.py && sleep 0' 30529 1726882676.87077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882676.87081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882676.87115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882676.87119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882676.87166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882676.87169: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882676.87219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882677.30748: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30529 1726882677.30846: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30529 1726882677.30919: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": 
[{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], <<< 30529 1726882677.30931: stdout chunk (state=3): >>>"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", <<< 30529 1726882677.30935: stdout chunk (state=3): >>>"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": 
"1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": 
[{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": 
[{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882677.32688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882677.32692: stdout chunk (state=3): >>><<< 30529 1726882677.32697: stderr chunk (state=3): >>><<< 30529 1726882677.32909: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
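The `package_facts` result above is a dict keyed by package name, where each value is a list of entries with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. A minimal sketch of consuming that structure (the `packages` dict below is a two-entry excerpt copied from this run; in a playbook the full dict lives under `ansible_facts.packages`):

```python
# Tiny excerpt of the package_facts output logged above; the real result
# is the full ansible_facts.packages dict returned by the module.
packages = {
    "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10",
                        "release": "1.el10", "epoch": 1, "arch": "x86_64",
                        "source": "rpm"}],
}

def nevra(pkg):
    """Format one entry as name-[epoch:]version-release.arch, omitting a null epoch."""
    epoch = f"{pkg['epoch']}:" if pkg["epoch"] is not None else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

print(nevra(packages["openssh"][0]))         # openssh-9.8p1-3.el10.x86_64
print(nevra(packages["NetworkManager"][0]))  # NetworkManager-1:1.48.10-1.el10.x86_64
```

Each value is a list because a name can own several installed entries (e.g. multiple kernel versions), so real code should iterate the list rather than index `[0]` unconditionally.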
30529 1726882677.46506: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882677.46511: _low_level_execute_command(): starting 30529 1726882677.46514: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882676.7651176-34815-270360643217979/ > /dev/null 2>&1 && sleep 0' 30529 1726882677.47553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882677.47568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882677.47579: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882677.47840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882677.47844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882677.47935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882677.48049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882677.49920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882677.49962: stderr chunk (state=3): >>><<< 30529 1726882677.49966: stdout chunk (state=3): >>><<< 30529 1726882677.50311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882677.50315: handler run 
complete 30529 1726882677.51903: variable 'ansible_facts' from source: unknown 30529 1726882677.52998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882677.57807: variable 'ansible_facts' from source: unknown 30529 1726882677.58629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882677.60667: attempt loop complete, returning result 30529 1726882677.60671: _execute() done 30529 1726882677.60673: dumping result to json 30529 1726882677.61278: done dumping result, returning 30529 1726882677.61412: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000001d88] 30529 1726882677.61550: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d88 30529 1726882677.80708: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d88 30529 1726882677.80712: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882677.80826: no more pending results, returning what we have 30529 1726882677.80829: results queue empty 30529 1726882677.80830: checking for any_errors_fatal 30529 1726882677.80834: done checking for any_errors_fatal 30529 1726882677.80835: checking for max_fail_percentage 30529 1726882677.80836: done checking for max_fail_percentage 30529 1726882677.80837: checking to see if all hosts have failed and the running result is not ok 30529 1726882677.80837: done checking to see if all hosts have failed 30529 1726882677.80838: getting the remaining hosts for this loop 30529 1726882677.80839: done getting the remaining hosts for this loop 30529 1726882677.80842: getting the next task for host managed_node1 30529 1726882677.80848: done getting next task for host managed_node1 30529 
1726882677.80851: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882677.80856: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882677.80866: getting variables 30529 1726882677.80867: in VariableManager get_vars() 30529 1726882677.80890: Calling all_inventory to load vars for managed_node1 30529 1726882677.80892: Calling groups_inventory to load vars for managed_node1 30529 1726882677.80896: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882677.80903: Calling all_plugins_play to load vars for managed_node1 30529 1726882677.80906: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882677.80909: Calling groups_plugins_play to load vars for managed_node1 30529 1726882677.83236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882677.86876: done with get_vars() 30529 1726882677.86909: done getting variables 30529 1726882677.86958: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:57 -0400 (0:00:01.155) 0:01:31.898 ****** 30529 1726882677.87203: entering _queue_task() for managed_node1/debug 30529 1726882677.87766: worker is 1 (out of 1 available) 30529 1726882677.87779: exiting _queue_task() for managed_node1/debug 30529 1726882677.88196: done queuing things up, now waiting for results queue to drain 30529 1726882677.88198: waiting for pending results... 
30529 1726882677.88436: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882677.89002: in run() - task 12673a56-9f93-b0f1-edc0-000000001d2c 30529 1726882677.89006: variable 'ansible_search_path' from source: unknown 30529 1726882677.89008: variable 'ansible_search_path' from source: unknown 30529 1726882677.89013: calling self._execute() 30529 1726882677.89015: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882677.89110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882677.89126: variable 'omit' from source: magic vars 30529 1726882677.89894: variable 'ansible_distribution_major_version' from source: facts 30529 1726882677.89939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882677.89952: variable 'omit' from source: magic vars 30529 1726882677.90071: variable 'omit' from source: magic vars 30529 1726882677.90360: variable 'network_provider' from source: set_fact 30529 1726882677.90385: variable 'omit' from source: magic vars 30529 1726882677.90435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882677.90690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882677.90696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882677.90698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882677.90701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882677.90906: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882677.90910: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882677.90913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882677.91038: Set connection var ansible_shell_executable to /bin/sh 30529 1726882677.91049: Set connection var ansible_pipelining to False 30529 1726882677.91056: Set connection var ansible_shell_type to sh 30529 1726882677.91069: Set connection var ansible_timeout to 10 30529 1726882677.91077: Set connection var ansible_connection to ssh 30529 1726882677.91086: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882677.91200: variable 'ansible_shell_executable' from source: unknown 30529 1726882677.91211: variable 'ansible_connection' from source: unknown 30529 1726882677.91221: variable 'ansible_module_compression' from source: unknown 30529 1726882677.91235: variable 'ansible_shell_type' from source: unknown 30529 1726882677.91242: variable 'ansible_shell_executable' from source: unknown 30529 1726882677.91250: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882677.91258: variable 'ansible_pipelining' from source: unknown 30529 1726882677.91265: variable 'ansible_timeout' from source: unknown 30529 1726882677.91272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882677.91534: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882677.91574: variable 'omit' from source: magic vars 30529 1726882677.91777: starting attempt loop 30529 1726882677.91781: running the handler 30529 1726882677.91784: handler run complete 30529 1726882677.91790: attempt loop complete, returning result 30529 1726882677.91792: _execute() done 30529 1726882677.91797: dumping result to json 30529 1726882677.91800: done dumping result, returning 
30529 1726882677.91802: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000001d2c] 30529 1726882677.91804: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2c ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882677.92065: no more pending results, returning what we have 30529 1726882677.92069: results queue empty 30529 1726882677.92070: checking for any_errors_fatal 30529 1726882677.92083: done checking for any_errors_fatal 30529 1726882677.92084: checking for max_fail_percentage 30529 1726882677.92086: done checking for max_fail_percentage 30529 1726882677.92090: checking to see if all hosts have failed and the running result is not ok 30529 1726882677.92091: done checking to see if all hosts have failed 30529 1726882677.92092: getting the remaining hosts for this loop 30529 1726882677.92095: done getting the remaining hosts for this loop 30529 1726882677.92099: getting the next task for host managed_node1 30529 1726882677.92108: done getting next task for host managed_node1 30529 1726882677.92112: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882677.92118: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882677.92131: getting variables 30529 1726882677.92134: in VariableManager get_vars() 30529 1726882677.92178: Calling all_inventory to load vars for managed_node1 30529 1726882677.92181: Calling groups_inventory to load vars for managed_node1 30529 1726882677.92183: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882677.92504: Calling all_plugins_play to load vars for managed_node1 30529 1726882677.92508: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882677.92513: Calling groups_plugins_play to load vars for managed_node1 30529 1726882677.93399: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2c 30529 1726882677.93402: WORKER PROCESS EXITING 30529 1726882677.95496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882677.98722: done with get_vars() 30529 1726882677.98751: done getting variables 30529 1726882677.99019: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:57 -0400 (0:00:00.118) 0:01:32.016 ****** 30529 1726882677.99066: entering _queue_task() for managed_node1/fail 30529 1726882678.00027: worker is 1 (out of 1 available) 30529 1726882678.00037: exiting _queue_task() for managed_node1/fail 30529 1726882678.00047: done queuing things up, now waiting for results queue to drain 30529 1726882678.00049: waiting for pending results... 30529 1726882678.00274: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882678.00666: in run() - task 12673a56-9f93-b0f1-edc0-000000001d2d 30529 1726882678.00692: variable 'ansible_search_path' from source: unknown 30529 1726882678.00724: variable 'ansible_search_path' from source: unknown 30529 1726882678.00841: calling self._execute() 30529 1726882678.00984: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.01054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.01069: variable 'omit' from source: magic vars 30529 1726882678.01916: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.01933: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.02392: variable 'network_state' from source: role '' defaults 30529 1726882678.02398: Evaluated conditional (network_state != {}): False 30529 1726882678.02400: when evaluation is False, skipping this task 30529 1726882678.02403: _execute() done 30529 1726882678.02405: dumping result to json 30529 1726882678.02408: done dumping result, returning 30529 1726882678.02411: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000001d2d] 30529 1726882678.02415: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2d 30529 1726882678.02495: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2d 30529 1726882678.02498: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882678.02548: no more pending results, returning what we have 30529 1726882678.02551: results queue empty 30529 1726882678.02552: checking for any_errors_fatal 30529 1726882678.02560: done checking for any_errors_fatal 30529 1726882678.02561: checking for max_fail_percentage 30529 1726882678.02562: done checking for max_fail_percentage 30529 1726882678.02563: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.02564: done checking to see if all hosts have failed 30529 1726882678.02565: getting the remaining hosts for this loop 30529 1726882678.02567: done getting the remaining hosts for this loop 30529 1726882678.02570: getting the next task for host managed_node1 30529 1726882678.02581: done getting next task for host managed_node1 30529 1726882678.02585: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882678.02595: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882678.02624: getting variables 30529 1726882678.02626: in VariableManager get_vars() 30529 1726882678.02670: Calling all_inventory to load vars for managed_node1 30529 1726882678.02672: Calling groups_inventory to load vars for managed_node1 30529 1726882678.02674: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.02690: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.02997: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.03002: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.06743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.11559: done with get_vars() 30529 1726882678.11589: done getting variables 30529 1726882678.12053: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:58 -0400 (0:00:00.130) 0:01:32.147 ****** 30529 1726882678.12094: entering _queue_task() for managed_node1/fail 30529 1726882678.13326: worker is 1 (out of 1 available) 30529 1726882678.13336: exiting _queue_task() for managed_node1/fail 30529 1726882678.13347: done queuing things up, now waiting for results queue to drain 30529 1726882678.13349: waiting for pending results... 30529 1726882678.13739: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882678.14165: in run() - task 12673a56-9f93-b0f1-edc0-000000001d2e 30529 1726882678.14169: variable 'ansible_search_path' from source: unknown 30529 1726882678.14172: variable 'ansible_search_path' from source: unknown 30529 1726882678.14175: calling self._execute() 30529 1726882678.14495: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.14501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.14504: variable 'omit' from source: magic vars 30529 1726882678.15300: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.15448: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.15748: variable 'network_state' from source: role '' defaults 30529 1726882678.15763: Evaluated conditional (network_state != {}): False 30529 1726882678.15816: when evaluation is False, skipping this task 30529 1726882678.15840: _execute() done 30529 1726882678.15845: dumping result to json 30529 1726882678.15849: done dumping result, returning 30529 1726882678.15861: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000001d2e] 30529 1726882678.15949: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882678.16097: no more pending results, returning what we have 30529 1726882678.16100: results queue empty 30529 1726882678.16102: checking for any_errors_fatal 30529 1726882678.16110: done checking for any_errors_fatal 30529 1726882678.16111: checking for max_fail_percentage 30529 1726882678.16112: done checking for max_fail_percentage 30529 1726882678.16113: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.16114: done checking to see if all hosts have failed 30529 1726882678.16115: getting the remaining hosts for this loop 30529 1726882678.16117: done getting the remaining hosts for this loop 30529 1726882678.16121: getting the next task for host managed_node1 30529 1726882678.16130: done getting next task for host managed_node1 30529 1726882678.16133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882678.16138: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882678.16165: getting variables 30529 1726882678.16167: in VariableManager get_vars() 30529 1726882678.16213: Calling all_inventory to load vars for managed_node1 30529 1726882678.16215: Calling groups_inventory to load vars for managed_node1 30529 1726882678.16218: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.16231: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.16234: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.16237: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.16972: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2e 30529 1726882678.16975: WORKER PROCESS EXITING 30529 1726882678.20589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.24188: done with get_vars() 30529 1726882678.24248: done getting variables 30529 1726882678.24316: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:58 -0400 (0:00:00.123) 0:01:32.270 ****** 30529 1726882678.24470: entering _queue_task() for managed_node1/fail 30529 1726882678.25263: worker is 1 (out of 1 available) 30529 1726882678.25277: exiting _queue_task() for managed_node1/fail 30529 1726882678.25289: done queuing things up, now waiting for results queue to drain 30529 1726882678.25290: waiting for pending results... 30529 1726882678.25851: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882678.26500: in run() - task 12673a56-9f93-b0f1-edc0-000000001d2f 30529 1726882678.26504: variable 'ansible_search_path' from source: unknown 30529 1726882678.26508: variable 'ansible_search_path' from source: unknown 30529 1726882678.26512: calling self._execute() 30529 1726882678.26514: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.26517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.26519: variable 'omit' from source: magic vars 30529 1726882678.27266: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.27698: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.27702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882678.33014: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882678.33271: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882678.33503: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882678.33542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882678.33899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882678.34099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.34103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.34106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.34108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.34111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.34246: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.34418: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882678.34548: variable 'ansible_distribution' from source: facts 30529 1726882678.34998: variable '__network_rh_distros' from source: role '' defaults 30529 1726882678.35002: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882678.35071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.35325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.35355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.35402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.35422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.35473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.35506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.35725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.35768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.35789: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.35836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.35865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.35899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.35939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.36116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.36644: variable 'network_connections' from source: include params 30529 1726882678.36912: variable 'interface' from source: play vars 30529 1726882678.37100: variable 'interface' from source: play vars 30529 1726882678.37104: variable 'network_state' from source: role '' defaults 30529 1726882678.37106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882678.37253: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882678.37537: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 
1726882678.37575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882678.37616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882678.37664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882678.37998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882678.38012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.38015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882678.38023: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882678.38032: when evaluation is False, skipping this task 30529 1726882678.38039: _execute() done 30529 1726882678.38046: dumping result to json 30529 1726882678.38052: done dumping result, returning 30529 1726882678.38063: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000001d2f] 30529 1726882678.38072: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2f skipping: [managed_node1] => { "changed": 
false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882678.38225: no more pending results, returning what we have 30529 1726882678.38228: results queue empty 30529 1726882678.38229: checking for any_errors_fatal 30529 1726882678.38236: done checking for any_errors_fatal 30529 1726882678.38236: checking for max_fail_percentage 30529 1726882678.38243: done checking for max_fail_percentage 30529 1726882678.38244: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.38245: done checking to see if all hosts have failed 30529 1726882678.38245: getting the remaining hosts for this loop 30529 1726882678.38247: done getting the remaining hosts for this loop 30529 1726882678.38251: getting the next task for host managed_node1 30529 1726882678.38261: done getting next task for host managed_node1 30529 1726882678.38265: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882678.38269: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882678.38304: getting variables 30529 1726882678.38306: in VariableManager get_vars() 30529 1726882678.38685: Calling all_inventory to load vars for managed_node1 30529 1726882678.38688: Calling groups_inventory to load vars for managed_node1 30529 1726882678.38690: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.38699: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d2f 30529 1726882678.38702: WORKER PROCESS EXITING 30529 1726882678.38796: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.38800: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.38803: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.41728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.44306: done with get_vars() 30529 1726882678.44329: done getting variables 30529 1726882678.44417: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:58 -0400 (0:00:00.199) 0:01:32.470 ****** 30529 1726882678.44457: entering _queue_task() for managed_node1/dnf 30529 1726882678.44963: worker is 1 (out of 1 available) 30529 1726882678.44975: exiting _queue_task() for managed_node1/dnf 30529 1726882678.44990: done queuing things up, now waiting for results queue to drain 30529 1726882678.44992: waiting for pending results... 30529 1726882678.45303: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882678.45473: in run() - task 12673a56-9f93-b0f1-edc0-000000001d30 30529 1726882678.45497: variable 'ansible_search_path' from source: unknown 30529 1726882678.45507: variable 'ansible_search_path' from source: unknown 30529 1726882678.45551: calling self._execute() 30529 1726882678.45654: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.45674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.45691: variable 'omit' from source: magic vars 30529 1726882678.46086: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.46116: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.46329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882678.49190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882678.49362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882678.49367: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882678.49386: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882678.49479: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882678.49573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.49646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.49697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.49801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.49804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.49885: variable 'ansible_distribution' from source: facts 30529 1726882678.49900: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.49931: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882678.50052: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882678.50236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.50239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.50251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.50302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.50322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.50377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.50408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.50434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.50486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.50510: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.50563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.50683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.50687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.50692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.50695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.50858: variable 'network_connections' from source: include params 30529 1726882678.50876: variable 'interface' from source: play vars 30529 1726882678.50954: variable 'interface' from source: play vars 30529 1726882678.51043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882678.51228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882678.51274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882678.51314: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882678.51360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882678.51449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882678.51520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882678.51880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.51884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882678.51886: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882678.52427: variable 'network_connections' from source: include params 30529 1726882678.52445: variable 'interface' from source: play vars 30529 1726882678.52666: variable 'interface' from source: play vars 30529 1726882678.52670: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882678.52673: when evaluation is False, skipping this task 30529 1726882678.52675: _execute() done 30529 1726882678.52677: dumping result to json 30529 1726882678.52679: done dumping result, returning 30529 1726882678.52684: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001d30] 30529 
1726882678.52700: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d30 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882678.52916: no more pending results, returning what we have 30529 1726882678.52920: results queue empty 30529 1726882678.52921: checking for any_errors_fatal 30529 1726882678.52929: done checking for any_errors_fatal 30529 1726882678.52929: checking for max_fail_percentage 30529 1726882678.52931: done checking for max_fail_percentage 30529 1726882678.52932: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.52933: done checking to see if all hosts have failed 30529 1726882678.52934: getting the remaining hosts for this loop 30529 1726882678.52936: done getting the remaining hosts for this loop 30529 1726882678.52940: getting the next task for host managed_node1 30529 1726882678.52948: done getting next task for host managed_node1 30529 1726882678.52952: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882678.52957: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882678.52983: getting variables 30529 1726882678.52985: in VariableManager get_vars() 30529 1726882678.53342: Calling all_inventory to load vars for managed_node1 30529 1726882678.53345: Calling groups_inventory to load vars for managed_node1 30529 1726882678.53348: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.53358: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.53362: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.53365: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.54100: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d30 30529 1726882678.54103: WORKER PROCESS EXITING 30529 1726882678.57753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.63029: done with get_vars() 30529 1726882678.63123: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882678.63326: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:58 -0400 (0:00:00.189) 0:01:32.659 ****** 30529 1726882678.63360: entering _queue_task() for managed_node1/yum 30529 1726882678.64460: worker is 1 (out of 1 available) 30529 1726882678.64472: exiting _queue_task() for managed_node1/yum 30529 1726882678.64484: done queuing things up, now waiting for results queue to drain 30529 1726882678.64486: waiting for pending results... 30529 1726882678.64879: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882678.65253: in run() - task 12673a56-9f93-b0f1-edc0-000000001d31 30529 1726882678.65312: variable 'ansible_search_path' from source: unknown 30529 1726882678.65322: variable 'ansible_search_path' from source: unknown 30529 1726882678.65367: calling self._execute() 30529 1726882678.65653: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.65726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.65730: variable 'omit' from source: magic vars 30529 1726882678.66753: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.66756: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.67052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882678.72172: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882678.72307: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882678.72417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882678.72519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882678.72622: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882678.72909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.73718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.73780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.73914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.73990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.74220: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.74248: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882678.74256: when evaluation is False, skipping this task 30529 1726882678.74337: _execute() done 30529 1726882678.74341: dumping result to json 30529 1726882678.74343: done dumping result, 
returning 30529 1726882678.74345: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001d31] 30529 1726882678.74348: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d31 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882678.74800: no more pending results, returning what we have 30529 1726882678.74805: results queue empty 30529 1726882678.74806: checking for any_errors_fatal 30529 1726882678.74811: done checking for any_errors_fatal 30529 1726882678.74812: checking for max_fail_percentage 30529 1726882678.74814: done checking for max_fail_percentage 30529 1726882678.74815: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.74817: done checking to see if all hosts have failed 30529 1726882678.74817: getting the remaining hosts for this loop 30529 1726882678.74819: done getting the remaining hosts for this loop 30529 1726882678.74823: getting the next task for host managed_node1 30529 1726882678.74833: done getting next task for host managed_node1 30529 1726882678.74837: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882678.74843: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882678.74870: getting variables 30529 1726882678.74872: in VariableManager get_vars() 30529 1726882678.74921: Calling all_inventory to load vars for managed_node1 30529 1726882678.74924: Calling groups_inventory to load vars for managed_node1 30529 1726882678.74927: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.74938: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.74942: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.74945: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.76444: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d31 30529 1726882678.77072: WORKER PROCESS EXITING 30529 1726882678.78923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.82183: done with get_vars() 30529 1726882678.82329: done getting variables 30529 1726882678.82386: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:58 -0400 (0:00:00.191) 0:01:32.851 ****** 30529 1726882678.82527: entering _queue_task() for managed_node1/fail 30529 1726882678.83172: worker is 1 (out of 1 available) 30529 1726882678.83185: exiting _queue_task() for managed_node1/fail 30529 1726882678.83197: done queuing things up, now waiting for results queue to drain 30529 1726882678.83198: waiting for pending results... 30529 1726882678.83412: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882678.83557: in run() - task 12673a56-9f93-b0f1-edc0-000000001d32 30529 1726882678.83573: variable 'ansible_search_path' from source: unknown 30529 1726882678.83577: variable 'ansible_search_path' from source: unknown 30529 1726882678.83618: calling self._execute() 30529 1726882678.83726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.83730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.83741: variable 'omit' from source: magic vars 30529 1726882678.84136: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.84153: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.84278: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882678.84482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882678.88816: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882678.88952: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882678.88956: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882678.88983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882678.89020: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882678.89113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.89169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.89205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.89250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.89277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.89389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 
1726882678.89395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.89398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.89439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.89457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882678.89509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882678.89540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882678.89568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.89619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882678.89699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30529 1726882678.89855: variable 'network_connections' from source: include params 30529 1726882678.89966: variable 'interface' from source: play vars 30529 1726882678.90132: variable 'interface' from source: play vars 30529 1726882678.90504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882678.90898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882678.90902: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882678.90904: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882678.90906: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882678.90908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882678.91012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882678.91185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882678.91217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882678.91279: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882678.91895: variable 'network_connections' from source: include params 30529 1726882678.91950: variable 'interface' from source: play 
vars 30529 1726882678.92084: variable 'interface' from source: play vars 30529 1726882678.92118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882678.92190: when evaluation is False, skipping this task 30529 1726882678.92201: _execute() done 30529 1726882678.92209: dumping result to json 30529 1726882678.92216: done dumping result, returning 30529 1726882678.92228: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001d32] 30529 1726882678.92350: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d32 30529 1726882678.92442: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d32 30529 1726882678.92446: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882678.92501: no more pending results, returning what we have 30529 1726882678.92505: results queue empty 30529 1726882678.92506: checking for any_errors_fatal 30529 1726882678.92512: done checking for any_errors_fatal 30529 1726882678.92513: checking for max_fail_percentage 30529 1726882678.92515: done checking for max_fail_percentage 30529 1726882678.92515: checking to see if all hosts have failed and the running result is not ok 30529 1726882678.92516: done checking to see if all hosts have failed 30529 1726882678.92517: getting the remaining hosts for this loop 30529 1726882678.92519: done getting the remaining hosts for this loop 30529 1726882678.92522: getting the next task for host managed_node1 30529 1726882678.92530: done getting next task for host managed_node1 30529 1726882678.92534: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882678.92538: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882678.92562: getting variables 30529 1726882678.92564: in VariableManager get_vars() 30529 1726882678.92612: Calling all_inventory to load vars for managed_node1 30529 1726882678.92614: Calling groups_inventory to load vars for managed_node1 30529 1726882678.92617: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882678.92628: Calling all_plugins_play to load vars for managed_node1 30529 1726882678.92631: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882678.92637: Calling groups_plugins_play to load vars for managed_node1 30529 1726882678.95240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882678.97346: done with get_vars() 30529 1726882678.97378: done getting variables 30529 1726882678.97546: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:58 -0400 (0:00:00.150) 0:01:33.001 ****** 30529 1726882678.97583: entering _queue_task() for managed_node1/package 30529 1726882678.98225: worker is 1 (out of 1 available) 30529 1726882678.98238: exiting _queue_task() for managed_node1/package 30529 1726882678.98250: done queuing things up, now waiting for results queue to drain 30529 1726882678.98252: waiting for pending results... 
30529 1726882678.98629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882678.98800: in run() - task 12673a56-9f93-b0f1-edc0-000000001d33 30529 1726882678.98804: variable 'ansible_search_path' from source: unknown 30529 1726882678.98808: variable 'ansible_search_path' from source: unknown 30529 1726882678.98812: calling self._execute() 30529 1726882678.98884: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882678.98891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882678.98900: variable 'omit' from source: magic vars 30529 1726882678.99307: variable 'ansible_distribution_major_version' from source: facts 30529 1726882678.99318: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882678.99597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882678.99808: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882678.99850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882678.99882: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882678.99961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882679.00080: variable 'network_packages' from source: role '' defaults 30529 1726882679.00242: variable '__network_provider_setup' from source: role '' defaults 30529 1726882679.00252: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882679.00357: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882679.00374: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882679.00469: variable 
'__network_packages_default_nm' from source: role '' defaults 30529 1726882679.00751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882679.04187: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882679.04304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882679.04334: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882679.04378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882679.04498: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882679.04526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.04554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.04585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.04658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.04674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882679.04730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.04802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.04806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.04826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.04841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.05145: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882679.05287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.05316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.05343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.05480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.05484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.05527: variable 'ansible_python' from source: facts 30529 1726882679.05542: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882679.05642: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882679.05728: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882679.05864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.05889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.05916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.05963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.05976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.06030: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.06107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.06112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.06298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.06301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.06305: variable 'network_connections' from source: include params 30529 1726882679.06307: variable 'interface' from source: play vars 30529 1726882679.06423: variable 'interface' from source: play vars 30529 1726882679.06504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882679.06529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882679.06560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.06592: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882679.06652: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882679.06978: variable 'network_connections' from source: include params 30529 1726882679.06981: variable 'interface' from source: play vars 30529 1726882679.07075: variable 'interface' from source: play vars 30529 1726882679.07111: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882679.07186: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882679.07488: variable 'network_connections' from source: include params 30529 1726882679.07496: variable 'interface' from source: play vars 30529 1726882679.07554: variable 'interface' from source: play vars 30529 1726882679.07574: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882679.07656: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882679.07997: variable 'network_connections' from source: include params 30529 1726882679.08000: variable 'interface' from source: play vars 30529 1726882679.08186: variable 'interface' from source: play vars 30529 1726882679.08194: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882679.08287: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882679.08296: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882679.08389: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882679.08706: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882679.09499: variable 'network_connections' from source: include params 30529 1726882679.09503: variable 'interface' from 
source: play vars 30529 1726882679.09584: variable 'interface' from source: play vars 30529 1726882679.09615: variable 'ansible_distribution' from source: facts 30529 1726882679.09626: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.09646: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.09663: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882679.09891: variable 'ansible_distribution' from source: facts 30529 1726882679.09907: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.09943: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.10048: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882679.10210: variable 'ansible_distribution' from source: facts 30529 1726882679.10220: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.10233: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.10282: variable 'network_provider' from source: set_fact 30529 1726882679.10384: variable 'ansible_facts' from source: unknown 30529 1726882679.11079: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882679.11089: when evaluation is False, skipping this task 30529 1726882679.11101: _execute() done 30529 1726882679.11108: dumping result to json 30529 1726882679.11115: done dumping result, returning 30529 1726882679.11129: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000001d33] 30529 1726882679.11147: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d33 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 
1726882679.11325: no more pending results, returning what we have 30529 1726882679.11329: results queue empty 30529 1726882679.11330: checking for any_errors_fatal 30529 1726882679.11337: done checking for any_errors_fatal 30529 1726882679.11338: checking for max_fail_percentage 30529 1726882679.11340: done checking for max_fail_percentage 30529 1726882679.11341: checking to see if all hosts have failed and the running result is not ok 30529 1726882679.11342: done checking to see if all hosts have failed 30529 1726882679.11342: getting the remaining hosts for this loop 30529 1726882679.11345: done getting the remaining hosts for this loop 30529 1726882679.11349: getting the next task for host managed_node1 30529 1726882679.11359: done getting next task for host managed_node1 30529 1726882679.11363: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882679.11368: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882679.11400: getting variables 30529 1726882679.11403: in VariableManager get_vars() 30529 1726882679.11448: Calling all_inventory to load vars for managed_node1 30529 1726882679.11451: Calling groups_inventory to load vars for managed_node1 30529 1726882679.11457: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882679.11468: Calling all_plugins_play to load vars for managed_node1 30529 1726882679.11471: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882679.11474: Calling groups_plugins_play to load vars for managed_node1 30529 1726882679.12225: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d33 30529 1726882679.12228: WORKER PROCESS EXITING 30529 1726882679.13899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882679.15913: done with get_vars() 30529 1726882679.15940: done getting variables 30529 1726882679.16052: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:59 -0400 (0:00:00.185) 0:01:33.187 ****** 30529 1726882679.16106: entering _queue_task() for managed_node1/package 30529 1726882679.16599: worker is 1 (out of 1 available) 30529 1726882679.16611: exiting _queue_task() for managed_node1/package 30529 1726882679.16683: done queuing things up, now waiting for results queue to drain 30529 
1726882679.16688: waiting for pending results... 30529 1726882679.16966: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882679.17258: in run() - task 12673a56-9f93-b0f1-edc0-000000001d34 30529 1726882679.17286: variable 'ansible_search_path' from source: unknown 30529 1726882679.17302: variable 'ansible_search_path' from source: unknown 30529 1726882679.17355: calling self._execute() 30529 1726882679.17504: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.17698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882679.17703: variable 'omit' from source: magic vars 30529 1726882679.18077: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.18098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882679.18265: variable 'network_state' from source: role '' defaults 30529 1726882679.18291: Evaluated conditional (network_state != {}): False 30529 1726882679.18311: when evaluation is False, skipping this task 30529 1726882679.18327: _execute() done 30529 1726882679.18335: dumping result to json 30529 1726882679.18341: done dumping result, returning 30529 1726882679.18353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001d34] 30529 1726882679.18370: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d34 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882679.18653: no more pending results, returning what we have 30529 1726882679.18657: results queue empty 30529 1726882679.18658: checking for any_errors_fatal 30529 1726882679.18666: done checking for any_errors_fatal 30529 
1726882679.18667: checking for max_fail_percentage 30529 1726882679.18669: done checking for max_fail_percentage 30529 1726882679.18670: checking to see if all hosts have failed and the running result is not ok 30529 1726882679.18671: done checking to see if all hosts have failed 30529 1726882679.18672: getting the remaining hosts for this loop 30529 1726882679.18674: done getting the remaining hosts for this loop 30529 1726882679.18678: getting the next task for host managed_node1 30529 1726882679.18689: done getting next task for host managed_node1 30529 1726882679.18694: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882679.18701: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882679.18730: getting variables 30529 1726882679.18732: in VariableManager get_vars() 30529 1726882679.18891: Calling all_inventory to load vars for managed_node1 30529 1726882679.18896: Calling groups_inventory to load vars for managed_node1 30529 1726882679.18899: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882679.18910: Calling all_plugins_play to load vars for managed_node1 30529 1726882679.18913: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882679.18916: Calling groups_plugins_play to load vars for managed_node1 30529 1726882679.19765: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d34 30529 1726882679.19771: WORKER PROCESS EXITING 30529 1726882679.22344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882679.24783: done with get_vars() 30529 1726882679.24807: done getting variables 30529 1726882679.24868: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:59 -0400 (0:00:00.088) 0:01:33.275 ****** 30529 1726882679.24963: entering _queue_task() for managed_node1/package 30529 1726882679.25466: worker is 1 (out of 1 available) 30529 1726882679.25478: exiting _queue_task() for managed_node1/package 30529 1726882679.25490: done queuing things up, now waiting for results queue to drain 30529 1726882679.25492: waiting for pending results... 
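The skip recorded above ("false_condition": "network_state != {}") is what a conditionally guarded package task produces when its `when` clause evaluates False. A minimal sketch of such a task follows — this is a hypothetical reconstruction for illustration, not the role's actual source (the real task lives under the collection's `roles/network/tasks/main.yml` and may differ):

```yaml
# Hypothetical sketch: a package task skipped when network_state is empty.
# Names and structure are assumptions; see the role source for the real task.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # role default is {}, so this run skips the task
```

With `network_state` left at its role default of `{}`, Ansible evaluates the second condition to False and emits exactly the `skipping: [managed_node1]` result with a recorded `false_condition`, as seen in the log.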
30529 1726882679.25841: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882679.26291: in run() - task 12673a56-9f93-b0f1-edc0-000000001d35 30529 1726882679.26338: variable 'ansible_search_path' from source: unknown 30529 1726882679.26372: variable 'ansible_search_path' from source: unknown 30529 1726882679.26414: calling self._execute() 30529 1726882679.26586: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.26602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882679.26617: variable 'omit' from source: magic vars 30529 1726882679.27217: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.27221: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882679.27421: variable 'network_state' from source: role '' defaults 30529 1726882679.27482: Evaluated conditional (network_state != {}): False 30529 1726882679.27491: when evaluation is False, skipping this task 30529 1726882679.27504: _execute() done 30529 1726882679.27529: dumping result to json 30529 1726882679.27540: done dumping result, returning 30529 1726882679.27551: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000001d35] 30529 1726882679.27566: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d35 30529 1726882679.27824: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d35 30529 1726882679.27828: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882679.27877: no more pending results, returning what we have 30529 1726882679.27881: results queue empty 30529 1726882679.27882: checking for 
any_errors_fatal 30529 1726882679.27889: done checking for any_errors_fatal 30529 1726882679.27889: checking for max_fail_percentage 30529 1726882679.27891: done checking for max_fail_percentage 30529 1726882679.27892: checking to see if all hosts have failed and the running result is not ok 30529 1726882679.27895: done checking to see if all hosts have failed 30529 1726882679.27896: getting the remaining hosts for this loop 30529 1726882679.27898: done getting the remaining hosts for this loop 30529 1726882679.27902: getting the next task for host managed_node1 30529 1726882679.27913: done getting next task for host managed_node1 30529 1726882679.27917: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882679.27923: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882679.27955: getting variables 30529 1726882679.27959: in VariableManager get_vars() 30529 1726882679.28130: Calling all_inventory to load vars for managed_node1 30529 1726882679.28134: Calling groups_inventory to load vars for managed_node1 30529 1726882679.28136: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882679.28168: Calling all_plugins_play to load vars for managed_node1 30529 1726882679.28174: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882679.28177: Calling groups_plugins_play to load vars for managed_node1 30529 1726882679.29919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882679.31539: done with get_vars() 30529 1726882679.31561: done getting variables 30529 1726882679.31624: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:59 -0400 (0:00:00.066) 0:01:33.342 ****** 30529 1726882679.31659: entering _queue_task() for managed_node1/service 30529 1726882679.31972: worker is 1 (out of 1 available) 30529 1726882679.31985: exiting _queue_task() for managed_node1/service 30529 1726882679.32200: done queuing things up, now waiting for results queue to drain 30529 1726882679.32202: waiting for pending results... 
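The next task in the log, "Restart NetworkManager due to wireless or team interfaces", is guarded by an or-combined conditional over two role defaults. A hedged sketch of what such a guarded service task might look like (hypothetical; the actual task at `roles/network/tasks/main.yml:109` may differ):

```yaml
# Hypothetical sketch: restart only when wireless or team connections exist.
# The __network_* variables are role internals; shown here as assumptions.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined for the `interface` under test, the combined condition evaluates False and the task is skipped, matching the `false_condition` the log records below.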
30529 1726882679.32413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882679.32438: in run() - task 12673a56-9f93-b0f1-edc0-000000001d36 30529 1726882679.32456: variable 'ansible_search_path' from source: unknown 30529 1726882679.32463: variable 'ansible_search_path' from source: unknown 30529 1726882679.32525: calling self._execute() 30529 1726882679.32605: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.32615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882679.32633: variable 'omit' from source: magic vars 30529 1726882679.33069: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.33075: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882679.33159: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882679.33366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882679.35671: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882679.35742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882679.35795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882679.35833: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882679.35887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882679.35950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882679.36005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.36036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.36102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.36106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.36150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.36176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.36316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.36320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.36322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.36324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.36339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.36366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.36407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.36430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.36613: variable 'network_connections' from source: include params 30529 1726882679.36630: variable 'interface' from source: play vars 30529 1726882679.36707: variable 'interface' from source: play vars 30529 1726882679.36789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882679.36976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882679.37012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882679.37089: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882679.37092: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882679.37126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882679.37151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882679.37179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.37220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882679.37272: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882679.37529: variable 'network_connections' from source: include params 30529 1726882679.37539: variable 'interface' from source: play vars 30529 1726882679.37633: variable 'interface' from source: play vars 30529 1726882679.37636: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882679.37639: when evaluation is False, skipping this task 30529 1726882679.37645: _execute() done 30529 1726882679.37651: dumping result to json 30529 1726882679.37657: done dumping result, returning 30529 1726882679.37668: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000001d36] 30529 1726882679.37676: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d36 30529 1726882679.37945: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000001d36 30529 1726882679.37955: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882679.38004: no more pending results, returning what we have 30529 1726882679.38008: results queue empty 30529 1726882679.38009: checking for any_errors_fatal 30529 1726882679.38016: done checking for any_errors_fatal 30529 1726882679.38017: checking for max_fail_percentage 30529 1726882679.38019: done checking for max_fail_percentage 30529 1726882679.38020: checking to see if all hosts have failed and the running result is not ok 30529 1726882679.38021: done checking to see if all hosts have failed 30529 1726882679.38021: getting the remaining hosts for this loop 30529 1726882679.38023: done getting the remaining hosts for this loop 30529 1726882679.38027: getting the next task for host managed_node1 30529 1726882679.38036: done getting next task for host managed_node1 30529 1726882679.38040: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882679.38046: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882679.38075: getting variables 30529 1726882679.38077: in VariableManager get_vars() 30529 1726882679.38122: Calling all_inventory to load vars for managed_node1 30529 1726882679.38125: Calling groups_inventory to load vars for managed_node1 30529 1726882679.38127: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882679.38138: Calling all_plugins_play to load vars for managed_node1 30529 1726882679.38141: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882679.38144: Calling groups_plugins_play to load vars for managed_node1 30529 1726882679.39874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882679.41372: done with get_vars() 30529 1726882679.41398: done getting variables 30529 1726882679.41464: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:59 -0400 (0:00:00.098) 0:01:33.441 ****** 30529 1726882679.41503: entering _queue_task() for managed_node1/service 30529 1726882679.41977: worker is 1 (out of 1 available) 30529 1726882679.41988: exiting _queue_task() for managed_node1/service 30529 1726882679.42001: done 
queuing things up, now waiting for results queue to drain 30529 1726882679.42003: waiting for pending results... 30529 1726882679.42241: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882679.42448: in run() - task 12673a56-9f93-b0f1-edc0-000000001d37 30529 1726882679.42452: variable 'ansible_search_path' from source: unknown 30529 1726882679.42454: variable 'ansible_search_path' from source: unknown 30529 1726882679.42476: calling self._execute() 30529 1726882679.42583: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.42608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882679.42664: variable 'omit' from source: magic vars 30529 1726882679.43049: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.43068: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882679.43259: variable 'network_provider' from source: set_fact 30529 1726882679.43271: variable 'network_state' from source: role '' defaults 30529 1726882679.43288: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882679.43302: variable 'omit' from source: magic vars 30529 1726882679.43481: variable 'omit' from source: magic vars 30529 1726882679.43484: variable 'network_service_name' from source: role '' defaults 30529 1726882679.43495: variable 'network_service_name' from source: role '' defaults 30529 1726882679.43616: variable '__network_provider_setup' from source: role '' defaults 30529 1726882679.43629: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882679.43704: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882679.43809: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882679.43814: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882679.44032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882679.46351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882679.46428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882679.46466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882679.46518: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882679.46600: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882679.46638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.46674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.46754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.46757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.46759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.46807: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.46831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.46854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.46901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.46919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.47188: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882679.47296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.47331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.47413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.47416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.47426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.47523: variable 'ansible_python' from source: facts 30529 1726882679.47549: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882679.47642: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882679.47722: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882679.47898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.47902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.47918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.47969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.47988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.48041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882679.48172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882679.48177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.48180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882679.48182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882679.48322: variable 'network_connections' from source: include params 30529 1726882679.48335: variable 'interface' from source: play vars 30529 1726882679.48417: variable 'interface' from source: play vars 30529 1726882679.48528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882679.48712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882679.48772: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882679.48820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882679.48871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882679.48953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882679.48987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882679.49014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882679.49037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882679.49079: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882679.49264: variable 'network_connections' from source: include params 30529 1726882679.49278: variable 'interface' from source: play vars 30529 1726882679.49329: variable 'interface' from source: play vars 30529 1726882679.49352: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882679.49411: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882679.49589: variable 'network_connections' from source: include params 30529 1726882679.49599: variable 'interface' from source: play vars 30529 1726882679.49650: variable 'interface' from source: play vars 30529 1726882679.49666: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882679.49724: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882679.49908: variable 'network_connections' from source: include params 30529 1726882679.49911: variable 'interface' from source: play vars 30529 1726882679.49963: variable 'interface' from source: play vars 30529 1726882679.50002: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882679.50049: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882679.50055: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882679.50100: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882679.50235: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882679.50542: variable 'network_connections' from source: include params 30529 1726882679.50545: variable 'interface' from source: play vars 30529 1726882679.50598: variable 'interface' from source: play vars 30529 1726882679.50604: variable 'ansible_distribution' from source: facts 30529 1726882679.50607: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.50615: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.50625: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882679.50846: variable 'ansible_distribution' from source: facts 30529 1726882679.50849: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.50851: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.50853: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882679.51100: variable 'ansible_distribution' from source: facts 30529 1726882679.51104: variable '__network_rh_distros' from source: role '' defaults 30529 1726882679.51106: variable 'ansible_distribution_major_version' from source: facts 30529 1726882679.51108: variable 'network_provider' from source: set_fact 30529 1726882679.51110: variable 'omit' from source: magic vars 30529 1726882679.51112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882679.51115: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882679.51117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882679.51119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882679.51141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882679.51154: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882679.51157: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.51161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882679.51261: Set connection var ansible_shell_executable to /bin/sh 30529 1726882679.51265: Set connection var ansible_pipelining to False 30529 1726882679.51267: Set connection var ansible_shell_type to sh 30529 1726882679.51278: Set connection var ansible_timeout to 10 30529 1726882679.51281: Set connection var ansible_connection to ssh 30529 1726882679.51285: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882679.51314: variable 'ansible_shell_executable' from source: unknown 30529 1726882679.51318: variable 'ansible_connection' from source: unknown 30529 1726882679.51320: variable 'ansible_module_compression' from source: unknown 30529 1726882679.51323: variable 'ansible_shell_type' from source: unknown 30529 1726882679.51325: variable 'ansible_shell_executable' from source: unknown 30529 1726882679.51327: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882679.51329: variable 'ansible_pipelining' from source: unknown 30529 1726882679.51331: variable 'ansible_timeout' from source: unknown 30529 1726882679.51345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882679.51433: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882679.51454: variable 'omit' from source: magic vars 30529 1726882679.51457: starting attempt loop 30529 1726882679.51460: running the handler 30529 1726882679.51529: variable 'ansible_facts' from source: unknown 30529 1726882679.52106: _low_level_execute_command(): starting 30529 1726882679.52118: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882679.52595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882679.52600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882679.52603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882679.52605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882679.52640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882679.52643: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.52712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.54402: stdout chunk (state=3): >>>/root <<< 30529 1726882679.54592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882679.54598: stdout chunk (state=3): >>><<< 30529 1726882679.54600: stderr chunk (state=3): >>><<< 30529 1726882679.54708: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882679.54713: _low_level_execute_command(): starting 30529 1726882679.54717: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780 `" && echo ansible-tmp-1726882679.5462074-34900-181002071117780="` echo /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780 `" ) && sleep 0' 30529 1726882679.55316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882679.55329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882679.55349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882679.55459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882679.55509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.55543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.57420: stdout chunk (state=3): >>>ansible-tmp-1726882679.5462074-34900-181002071117780=/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780 <<< 30529 1726882679.57594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882679.57598: stdout 
chunk (state=3): >>><<< 30529 1726882679.57601: stderr chunk (state=3): >>><<< 30529 1726882679.57799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882679.5462074-34900-181002071117780=/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882679.57802: variable 'ansible_module_compression' from source: unknown 30529 1726882679.57805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882679.57807: variable 'ansible_facts' from source: unknown 30529 1726882679.58022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py 30529 1726882679.58277: Sending initial data 30529 1726882679.58281: Sent initial data (156 
bytes) 30529 1726882679.58852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882679.58865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882679.58879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882679.58902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882679.58935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882679.59036: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882679.59059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.59139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.60658: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882679.60712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882679.60759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjtvc20t0 /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py <<< 30529 1726882679.60778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py" <<< 30529 1726882679.60823: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjtvc20t0" to remote "/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py" <<< 30529 1726882679.62439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882679.62479: stderr chunk (state=3): >>><<< 30529 1726882679.62499: stdout chunk (state=3): >>><<< 30529 1726882679.62534: done transferring module to remote 30529 1726882679.62598: _low_level_execute_command(): starting 30529 1726882679.62602: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/ /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py && sleep 0' 30529 1726882679.63238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30529 1726882679.63305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882679.63377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882679.63445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.63496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.65184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882679.65211: stderr chunk (state=3): >>><<< 30529 1726882679.65215: stdout chunk (state=3): >>><<< 30529 1726882679.65227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882679.65230: _low_level_execute_command(): starting 30529 1726882679.65234: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/AnsiballZ_systemd.py && sleep 0' 30529 1726882679.65779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882679.65800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882679.65828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.65905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.94312: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10903552", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305013248", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1899591000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882679.94344: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882679.94363: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882679.96031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882679.96055: stderr chunk (state=3): >>><<< 30529 1726882679.96058: stdout chunk (state=3): >>><<< 30529 1726882679.96078: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10903552", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305013248", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1899591000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882679.96199: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882679.96218: _low_level_execute_command(): starting 30529 1726882679.96221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882679.5462074-34900-181002071117780/ > /dev/null 2>&1 && sleep 0' 30529 1726882679.96902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882679.96917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882679.96969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882679.96986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882679.97014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882679.97105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882679.98870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882679.98898: stderr chunk (state=3): >>><<< 30529 1726882679.98901: stdout chunk (state=3): >>><<< 30529 1726882679.98913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 
1726882679.98919: handler run complete 30529 1726882679.98955: attempt loop complete, returning result 30529 1726882679.98958: _execute() done 30529 1726882679.98960: dumping result to json 30529 1726882679.98975: done dumping result, returning 30529 1726882679.98983: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000001d37] 30529 1726882679.98997: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d37 30529 1726882679.99185: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d37 30529 1726882679.99190: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882679.99257: no more pending results, returning what we have 30529 1726882679.99260: results queue empty 30529 1726882679.99261: checking for any_errors_fatal 30529 1726882679.99268: done checking for any_errors_fatal 30529 1726882679.99269: checking for max_fail_percentage 30529 1726882679.99270: done checking for max_fail_percentage 30529 1726882679.99271: checking to see if all hosts have failed and the running result is not ok 30529 1726882679.99272: done checking to see if all hosts have failed 30529 1726882679.99272: getting the remaining hosts for this loop 30529 1726882679.99274: done getting the remaining hosts for this loop 30529 1726882679.99277: getting the next task for host managed_node1 30529 1726882679.99285: done getting next task for host managed_node1 30529 1726882679.99291: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882679.99298: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882679.99311: getting variables 30529 1726882679.99313: in VariableManager get_vars() 30529 1726882679.99377: Calling all_inventory to load vars for managed_node1 30529 1726882679.99380: Calling groups_inventory to load vars for managed_node1 30529 1726882679.99382: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882679.99401: Calling all_plugins_play to load vars for managed_node1 30529 1726882679.99405: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882679.99408: Calling groups_plugins_play to load vars for managed_node1 30529 1726882680.00245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882680.01520: done with get_vars() 30529 1726882680.01546: done getting variables 30529 1726882680.01623: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:00 -0400 (0:00:00.601) 0:01:34.042 ****** 30529 1726882680.01663: entering _queue_task() for managed_node1/service 30529 1726882680.01941: worker is 1 (out of 1 available) 30529 1726882680.01954: exiting _queue_task() for managed_node1/service 30529 1726882680.01967: done queuing things up, now waiting for results queue to drain 30529 1726882680.01969: waiting for pending results... 30529 1726882680.02156: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882680.02254: in run() - task 12673a56-9f93-b0f1-edc0-000000001d38 30529 1726882680.02265: variable 'ansible_search_path' from source: unknown 30529 1726882680.02268: variable 'ansible_search_path' from source: unknown 30529 1726882680.02330: calling self._execute() 30529 1726882680.02390: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.02397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.02409: variable 'omit' from source: magic vars 30529 1726882680.02877: variable 'ansible_distribution_major_version' from source: facts 30529 1726882680.02898: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882680.03036: variable 'network_provider' from source: set_fact 30529 1726882680.03056: Evaluated conditional (network_provider == "nm"): True 30529 1726882680.03299: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882680.03448: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30529 1726882680.04006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882680.06294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882680.06343: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882680.06371: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882680.06402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882680.06424: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882680.06492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882680.06522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882680.06539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882680.06569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882680.06588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882680.06628: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882680.06644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882680.06664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882680.06700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882680.06722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882680.06767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882680.06781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882680.06811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882680.06837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882680.06849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882680.06977: variable 'network_connections' from source: include params 30529 1726882680.06994: variable 'interface' from source: play vars 30529 1726882680.07063: variable 'interface' from source: play vars 30529 1726882680.07123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882680.07322: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882680.07357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882680.07404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882680.07421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882680.07452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882680.07474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882680.07501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882680.07518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882680.07568: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30529 1726882680.07795: variable 'network_connections' from source: include params 30529 1726882680.07799: variable 'interface' from source: play vars 30529 1726882680.08098: variable 'interface' from source: play vars 30529 1726882680.08102: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882680.08104: when evaluation is False, skipping this task 30529 1726882680.08106: _execute() done 30529 1726882680.08107: dumping result to json 30529 1726882680.08110: done dumping result, returning 30529 1726882680.08112: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000001d38] 30529 1726882680.08120: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d38 30529 1726882680.08198: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d38 30529 1726882680.08202: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882680.08257: no more pending results, returning what we have 30529 1726882680.08260: results queue empty 30529 1726882680.08261: checking for any_errors_fatal 30529 1726882680.08277: done checking for any_errors_fatal 30529 1726882680.08278: checking for max_fail_percentage 30529 1726882680.08279: done checking for max_fail_percentage 30529 1726882680.08280: checking to see if all hosts have failed and the running result is not ok 30529 1726882680.08281: done checking to see if all hosts have failed 30529 1726882680.08282: getting the remaining hosts for this loop 30529 1726882680.08283: done getting the remaining hosts for this loop 30529 1726882680.08287: getting the next task for host managed_node1 30529 1726882680.08298: done getting next task for host managed_node1 30529 1726882680.08301: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30529 1726882680.08306: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882680.08327: getting variables 30529 1726882680.08328: in VariableManager get_vars() 30529 1726882680.08367: Calling all_inventory to load vars for managed_node1 30529 1726882680.08370: Calling groups_inventory to load vars for managed_node1 30529 1726882680.08372: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882680.08381: Calling all_plugins_play to load vars for managed_node1 30529 1726882680.08383: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882680.08386: Calling groups_plugins_play to load vars for managed_node1 30529 1726882680.09495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882680.10372: done with get_vars() 30529 1726882680.10390: done getting variables 30529 1726882680.10466: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:00 -0400 (0:00:00.088) 0:01:34.131 ****** 30529 1726882680.10515: entering _queue_task() for managed_node1/service 30529 1726882680.10844: worker is 1 (out of 1 available) 30529 1726882680.10859: exiting _queue_task() for managed_node1/service 30529 1726882680.10872: done queuing things up, now waiting for results queue to drain 30529 1726882680.10874: waiting for pending results... 
30529 1726882680.11133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882680.11274: in run() - task 12673a56-9f93-b0f1-edc0-000000001d39 30529 1726882680.11297: variable 'ansible_search_path' from source: unknown 30529 1726882680.11314: variable 'ansible_search_path' from source: unknown 30529 1726882680.11339: calling self._execute() 30529 1726882680.11421: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.11438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.11446: variable 'omit' from source: magic vars 30529 1726882680.11898: variable 'ansible_distribution_major_version' from source: facts 30529 1726882680.11902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882680.11975: variable 'network_provider' from source: set_fact 30529 1726882680.11987: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882680.11999: when evaluation is False, skipping this task 30529 1726882680.12012: _execute() done 30529 1726882680.12025: dumping result to json 30529 1726882680.12038: done dumping result, returning 30529 1726882680.12051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000001d39] 30529 1726882680.12068: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d39 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882680.12311: no more pending results, returning what we have 30529 1726882680.12314: results queue empty 30529 1726882680.12315: checking for any_errors_fatal 30529 1726882680.12324: done checking for any_errors_fatal 30529 1726882680.12325: checking for max_fail_percentage 30529 1726882680.12327: done checking for max_fail_percentage 30529 
1726882680.12328: checking to see if all hosts have failed and the running result is not ok 30529 1726882680.12329: done checking to see if all hosts have failed 30529 1726882680.12329: getting the remaining hosts for this loop 30529 1726882680.12331: done getting the remaining hosts for this loop 30529 1726882680.12334: getting the next task for host managed_node1 30529 1726882680.12345: done getting next task for host managed_node1 30529 1726882680.12349: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882680.12354: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882680.12379: getting variables 30529 1726882680.12380: in VariableManager get_vars() 30529 1726882680.12566: Calling all_inventory to load vars for managed_node1 30529 1726882680.12568: Calling groups_inventory to load vars for managed_node1 30529 1726882680.12571: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882680.12580: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d39 30529 1726882680.12586: WORKER PROCESS EXITING 30529 1726882680.12598: Calling all_plugins_play to load vars for managed_node1 30529 1726882680.12605: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882680.12611: Calling groups_plugins_play to load vars for managed_node1 30529 1726882680.13958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882680.15757: done with get_vars() 30529 1726882680.15782: done getting variables 30529 1726882680.15849: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:00 -0400 (0:00:00.053) 0:01:34.184 ****** 30529 1726882680.15886: entering _queue_task() for managed_node1/copy 30529 1726882680.16301: worker is 1 (out of 1 available) 30529 1726882680.16316: exiting _queue_task() for managed_node1/copy 30529 1726882680.16334: done queuing things up, now waiting for results queue to drain 30529 1726882680.16338: waiting for pending results... 
30529 1726882680.16655: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882680.16830: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3a 30529 1726882680.16848: variable 'ansible_search_path' from source: unknown 30529 1726882680.16855: variable 'ansible_search_path' from source: unknown 30529 1726882680.16903: calling self._execute() 30529 1726882680.17011: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.17027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.17043: variable 'omit' from source: magic vars 30529 1726882680.17484: variable 'ansible_distribution_major_version' from source: facts 30529 1726882680.17508: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882680.17644: variable 'network_provider' from source: set_fact 30529 1726882680.17656: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882680.17684: when evaluation is False, skipping this task 30529 1726882680.17689: _execute() done 30529 1726882680.17794: dumping result to json 30529 1726882680.17798: done dumping result, returning 30529 1726882680.17804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000001d3a] 30529 1726882680.17806: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3a 30529 1726882680.17902: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3a 30529 1726882680.17905: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882680.17963: no more pending results, returning what we have 30529 1726882680.17967: results queue empty 30529 1726882680.17968: checking for 
any_errors_fatal 30529 1726882680.17974: done checking for any_errors_fatal 30529 1726882680.17974: checking for max_fail_percentage 30529 1726882680.17977: done checking for max_fail_percentage 30529 1726882680.17978: checking to see if all hosts have failed and the running result is not ok 30529 1726882680.17979: done checking to see if all hosts have failed 30529 1726882680.17980: getting the remaining hosts for this loop 30529 1726882680.17982: done getting the remaining hosts for this loop 30529 1726882680.17986: getting the next task for host managed_node1 30529 1726882680.18003: done getting next task for host managed_node1 30529 1726882680.18009: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882680.18017: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882680.18050: getting variables 30529 1726882680.18053: in VariableManager get_vars() 30529 1726882680.18306: Calling all_inventory to load vars for managed_node1 30529 1726882680.18309: Calling groups_inventory to load vars for managed_node1 30529 1726882680.18312: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882680.18329: Calling all_plugins_play to load vars for managed_node1 30529 1726882680.18333: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882680.18337: Calling groups_plugins_play to load vars for managed_node1 30529 1726882680.19946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882680.21580: done with get_vars() 30529 1726882680.21612: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:00 -0400 (0:00:00.058) 0:01:34.243 ****** 30529 1726882680.21708: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882680.22117: worker is 1 (out of 1 available) 30529 1726882680.22136: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882680.22150: done queuing things up, now waiting for results queue to drain 30529 1726882680.22152: waiting for pending results... 
30529 1726882680.22620: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882680.22711: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3b 30529 1726882680.22714: variable 'ansible_search_path' from source: unknown 30529 1726882680.22717: variable 'ansible_search_path' from source: unknown 30529 1726882680.22745: calling self._execute() 30529 1726882680.22867: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.22878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.22899: variable 'omit' from source: magic vars 30529 1726882680.23334: variable 'ansible_distribution_major_version' from source: facts 30529 1726882680.23398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882680.23402: variable 'omit' from source: magic vars 30529 1726882680.23455: variable 'omit' from source: magic vars 30529 1726882680.23629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882680.25912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882680.25994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882680.26198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882680.26201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882680.26203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882680.26205: variable 'network_provider' from source: set_fact 30529 1726882680.26331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882680.26367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882680.26405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882680.26454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882680.26473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882680.26558: variable 'omit' from source: magic vars 30529 1726882680.26675: variable 'omit' from source: magic vars 30529 1726882680.26783: variable 'network_connections' from source: include params 30529 1726882680.26806: variable 'interface' from source: play vars 30529 1726882680.26878: variable 'interface' from source: play vars 30529 1726882680.27039: variable 'omit' from source: magic vars 30529 1726882680.27053: variable '__lsr_ansible_managed' from source: task vars 30529 1726882680.27129: variable '__lsr_ansible_managed' from source: task vars 30529 1726882680.27339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882680.27562: Loaded config def from plugin (lookup/template) 30529 1726882680.27571: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882680.27613: File lookup term: get_ansible_managed.j2 30529 1726882680.27739: variable 
'ansible_search_path' from source: unknown 30529 1726882680.27745: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882680.27751: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882680.27754: variable 'ansible_search_path' from source: unknown 30529 1726882680.48081: variable 'ansible_managed' from source: unknown 30529 1726882680.48247: variable 'omit' from source: magic vars 30529 1726882680.48275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882680.48303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882680.48311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882680.48327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882680.48341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882680.48367: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882680.48373: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.48376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.48600: Set connection var ansible_shell_executable to /bin/sh 30529 1726882680.48604: Set connection var ansible_pipelining to False 30529 1726882680.48606: Set connection var ansible_shell_type to sh 30529 1726882680.48608: Set connection var ansible_timeout to 10 30529 1726882680.48610: Set connection var ansible_connection to ssh 30529 1726882680.48612: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882680.48614: variable 'ansible_shell_executable' from source: unknown 30529 1726882680.48617: variable 'ansible_connection' from source: unknown 30529 1726882680.48619: variable 'ansible_module_compression' from source: unknown 30529 1726882680.48621: variable 'ansible_shell_type' from source: unknown 30529 1726882680.48623: variable 'ansible_shell_executable' from source: unknown 30529 1726882680.48626: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882680.48628: variable 'ansible_pipelining' from source: unknown 30529 1726882680.48630: variable 'ansible_timeout' from source: unknown 30529 1726882680.48632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882680.48682: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882680.48698: variable 'omit' from 
source: magic vars 30529 1726882680.48701: starting attempt loop 30529 1726882680.48708: running the handler 30529 1726882680.48712: _low_level_execute_command(): starting 30529 1726882680.48820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882680.49715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882680.49807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882680.49849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.49898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882680.51575: stdout chunk (state=3): >>>/root <<< 30529 1726882680.51720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882680.51731: stdout chunk (state=3): >>><<< 30529 1726882680.51744: stderr chunk (state=3): >>><<< 30529 1726882680.51869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882680.51873: _low_level_execute_command(): starting 30529 1726882680.51876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626 `" && echo ansible-tmp-1726882680.5177922-34931-213072354331626="` echo /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626 `" ) && sleep 0' 30529 1726882680.52486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882680.52526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882680.52637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882680.52664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.52735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882680.54628: stdout chunk (state=3): >>>ansible-tmp-1726882680.5177922-34931-213072354331626=/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626 <<< 30529 1726882680.54811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882680.54816: stdout chunk (state=3): >>><<< 30529 1726882680.54818: stderr chunk (state=3): >>><<< 30529 1726882680.54822: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882680.5177922-34931-213072354331626=/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882680.55030: variable 'ansible_module_compression' from source: unknown 30529 1726882680.55034: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882680.55067: variable 'ansible_facts' from source: unknown 30529 1726882680.55483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py 30529 1726882680.55629: Sending initial data 30529 1726882680.55710: Sent initial data (168 bytes) 30529 1726882680.56824: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882680.56839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration <<< 30529 1726882680.56850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882680.56911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882680.56923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.57017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882680.58495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882680.58535: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882680.58621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpegvlr8vv /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py <<< 30529 1726882680.58682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpegvlr8vv" to remote "/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py" <<< 30529 1726882680.60486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882680.60500: stdout chunk (state=3): >>><<< 30529 1726882680.60504: stderr chunk (state=3): >>><<< 30529 1726882680.60554: done transferring module to remote 30529 1726882680.60637: _low_level_execute_command(): starting 30529 1726882680.60640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/ /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py && sleep 0' 30529 1726882680.61402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882680.61409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882680.61423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.61485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882680.63299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882680.63303: stdout chunk (state=3): >>><<< 30529 1726882680.63305: stderr chunk (state=3): >>><<< 30529 1726882680.63318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882680.63321: _low_level_execute_command(): starting 30529 1726882680.63346: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/AnsiballZ_network_connections.py && sleep 0' 30529 1726882680.64443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882680.64450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882680.64462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882680.64475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882680.64489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882680.64494: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882680.64507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882680.64526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882680.64539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882680.64550: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882680.64553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882680.64562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882680.64607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882680.64646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882680.64663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882680.64696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.64734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882680.96055: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882680.98055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882680.98059: stdout chunk (state=3): >>><<< 30529 1726882680.98062: stderr chunk (state=3): >>><<< 30529 1726882680.98064: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882680.98082: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882680.98094: _low_level_execute_command(): starting 30529 1726882680.98165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882680.5177922-34931-213072354331626/ > /dev/null 2>&1 && sleep 0' 30529 1726882680.98854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882680.98907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882680.98913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882680.98975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882680.98979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882680.99031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.01081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.01085: stdout chunk (state=3): >>><<< 30529 1726882681.01086: stderr chunk (state=3): >>><<< 30529 1726882681.01091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882681.01094: handler run complete 30529 1726882681.01096: attempt loop complete, returning result 30529 1726882681.01098: _execute() done 30529 1726882681.01099: dumping result to json 30529 1726882681.01101: done dumping result, returning 30529 1726882681.01102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000001d3b] 30529 1726882681.01104: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3b 30529 1726882681.01172: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3b 30529 1726882681.01175: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30529 1726882681.01491: no more pending results, returning what we have 30529 1726882681.01497: results queue empty 30529 1726882681.01498: checking for any_errors_fatal 30529 1726882681.01503: done checking for any_errors_fatal 30529 1726882681.01504: checking for max_fail_percentage 30529 1726882681.01506: done checking for max_fail_percentage 30529 1726882681.01506: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.01512: done checking to see if all hosts have failed 30529 1726882681.01513: getting the remaining hosts for this loop 30529 1726882681.01514: done getting the remaining 
hosts for this loop 30529 1726882681.01517: getting the next task for host managed_node1 30529 1726882681.01525: done getting next task for host managed_node1 30529 1726882681.01528: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882681.01532: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.01543: getting variables 30529 1726882681.01545: in VariableManager get_vars() 30529 1726882681.01578: Calling all_inventory to load vars for managed_node1 30529 1726882681.01580: Calling groups_inventory to load vars for managed_node1 30529 1726882681.01582: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.01598: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.01602: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.01605: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.03841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.12032: done with get_vars() 30529 1726882681.12065: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:01 -0400 (0:00:00.904) 0:01:35.147 ****** 30529 1726882681.12141: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882681.12530: worker is 1 (out of 1 available) 30529 1726882681.12544: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882681.12558: done queuing things up, now waiting for results queue to drain 30529 1726882681.12561: waiting for pending results... 
30529 1726882681.12943: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882681.13100: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3c 30529 1726882681.13105: variable 'ansible_search_path' from source: unknown 30529 1726882681.13109: variable 'ansible_search_path' from source: unknown 30529 1726882681.13141: calling self._execute() 30529 1726882681.13298: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.13303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.13307: variable 'omit' from source: magic vars 30529 1726882681.13730: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.13770: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.14100: variable 'network_state' from source: role '' defaults 30529 1726882681.14108: Evaluated conditional (network_state != {}): False 30529 1726882681.14112: when evaluation is False, skipping this task 30529 1726882681.14115: _execute() done 30529 1726882681.14118: dumping result to json 30529 1726882681.14120: done dumping result, returning 30529 1726882681.14123: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000001d3c] 30529 1726882681.14126: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3c 30529 1726882681.14213: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3c 30529 1726882681.14216: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882681.14272: no more pending results, returning what we have 30529 1726882681.14277: results queue empty 30529 1726882681.14278: checking for any_errors_fatal 30529 1726882681.14294: done checking for any_errors_fatal 
30529 1726882681.14295: checking for max_fail_percentage 30529 1726882681.14298: done checking for max_fail_percentage 30529 1726882681.14299: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.14300: done checking to see if all hosts have failed 30529 1726882681.14300: getting the remaining hosts for this loop 30529 1726882681.14303: done getting the remaining hosts for this loop 30529 1726882681.14307: getting the next task for host managed_node1 30529 1726882681.14316: done getting next task for host managed_node1 30529 1726882681.14321: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882681.14328: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.14355: getting variables 30529 1726882681.14357: in VariableManager get_vars() 30529 1726882681.14528: Calling all_inventory to load vars for managed_node1 30529 1726882681.14532: Calling groups_inventory to load vars for managed_node1 30529 1726882681.14535: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.14547: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.14550: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.14553: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.16088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.17845: done with get_vars() 30529 1726882681.17873: done getting variables 30529 1726882681.17949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:01 -0400 (0:00:00.058) 0:01:35.205 ****** 30529 1726882681.17986: entering _queue_task() for managed_node1/debug 30529 1726882681.18513: worker is 1 (out of 1 available) 30529 1726882681.18526: exiting _queue_task() for managed_node1/debug 30529 1726882681.18538: done queuing things up, now waiting for results queue to drain 30529 1726882681.18540: waiting for pending results... 
30529 1726882681.18678: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882681.18874: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3d 30529 1726882681.19022: variable 'ansible_search_path' from source: unknown 30529 1726882681.19026: variable 'ansible_search_path' from source: unknown 30529 1726882681.19029: calling self._execute() 30529 1726882681.19224: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.19229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.19245: variable 'omit' from source: magic vars 30529 1726882681.19695: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.19708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.19711: variable 'omit' from source: magic vars 30529 1726882681.19792: variable 'omit' from source: magic vars 30529 1726882681.19850: variable 'omit' from source: magic vars 30529 1726882681.19868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882681.19914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882681.19959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882681.19962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.19970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.20067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882681.20071: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.20073: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882681.20136: Set connection var ansible_shell_executable to /bin/sh 30529 1726882681.20139: Set connection var ansible_pipelining to False 30529 1726882681.20142: Set connection var ansible_shell_type to sh 30529 1726882681.20158: Set connection var ansible_timeout to 10 30529 1726882681.20161: Set connection var ansible_connection to ssh 30529 1726882681.20166: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882681.20192: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.20197: variable 'ansible_connection' from source: unknown 30529 1726882681.20200: variable 'ansible_module_compression' from source: unknown 30529 1726882681.20202: variable 'ansible_shell_type' from source: unknown 30529 1726882681.20204: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.20206: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.20296: variable 'ansible_pipelining' from source: unknown 30529 1726882681.20299: variable 'ansible_timeout' from source: unknown 30529 1726882681.20301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.20377: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882681.20386: variable 'omit' from source: magic vars 30529 1726882681.20398: starting attempt loop 30529 1726882681.20401: running the handler 30529 1726882681.20584: variable '__network_connections_result' from source: set_fact 30529 1726882681.20596: handler run complete 30529 1726882681.20614: attempt loop complete, returning result 30529 1726882681.20617: _execute() done 30529 1726882681.20620: dumping result to json 30529 1726882681.20622: 
done dumping result, returning 30529 1726882681.20632: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001d3d] 30529 1726882681.20799: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3d 30529 1726882681.20873: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3d 30529 1726882681.20876: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30529 1726882681.21265: no more pending results, returning what we have 30529 1726882681.21268: results queue empty 30529 1726882681.21269: checking for any_errors_fatal 30529 1726882681.21274: done checking for any_errors_fatal 30529 1726882681.21274: checking for max_fail_percentage 30529 1726882681.21276: done checking for max_fail_percentage 30529 1726882681.21277: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.21278: done checking to see if all hosts have failed 30529 1726882681.21278: getting the remaining hosts for this loop 30529 1726882681.21280: done getting the remaining hosts for this loop 30529 1726882681.21283: getting the next task for host managed_node1 30529 1726882681.21291: done getting next task for host managed_node1 30529 1726882681.21297: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882681.21303: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882681.21314: getting variables 30529 1726882681.21316: in VariableManager get_vars() 30529 1726882681.21357: Calling all_inventory to load vars for managed_node1 30529 1726882681.21360: Calling groups_inventory to load vars for managed_node1 30529 1726882681.21362: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.21372: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.21375: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.21378: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.24867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.26971: done with get_vars() 30529 1726882681.27002: done getting variables 30529 1726882681.27108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:01 -0400 (0:00:00.091) 0:01:35.297 ****** 30529 1726882681.27155: entering _queue_task() for managed_node1/debug 30529 1726882681.27906: worker is 1 (out of 1 available) 30529 1726882681.27915: exiting _queue_task() for managed_node1/debug 30529 1726882681.27926: done queuing things up, now waiting for results queue to drain 30529 1726882681.27927: waiting for pending results... 30529 1726882681.28106: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882681.28202: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3e 30529 1726882681.28206: variable 'ansible_search_path' from source: unknown 30529 1726882681.28208: variable 'ansible_search_path' from source: unknown 30529 1726882681.28215: calling self._execute() 30529 1726882681.28333: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.28337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.28340: variable 'omit' from source: magic vars 30529 1726882681.28764: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.28767: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.28770: variable 'omit' from source: magic vars 30529 1726882681.28867: variable 'omit' from source: magic vars 30529 1726882681.28870: variable 'omit' from source: magic vars 30529 1726882681.28912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882681.29182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882681.29185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882681.29191: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.29195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.29198: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882681.29201: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.29203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.29206: Set connection var ansible_shell_executable to /bin/sh 30529 1726882681.29208: Set connection var ansible_pipelining to False 30529 1726882681.29210: Set connection var ansible_shell_type to sh 30529 1726882681.29212: Set connection var ansible_timeout to 10 30529 1726882681.29215: Set connection var ansible_connection to ssh 30529 1726882681.29217: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882681.29219: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.29221: variable 'ansible_connection' from source: unknown 30529 1726882681.29223: variable 'ansible_module_compression' from source: unknown 30529 1726882681.29225: variable 'ansible_shell_type' from source: unknown 30529 1726882681.29227: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.29229: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.29231: variable 'ansible_pipelining' from source: unknown 30529 1726882681.29233: variable 'ansible_timeout' from source: unknown 30529 1726882681.29235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.29476: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882681.29486: variable 'omit' from source: magic vars 30529 1726882681.29492: starting attempt loop 30529 1726882681.29497: running the handler 30529 1726882681.29698: variable '__network_connections_result' from source: set_fact 30529 1726882681.29701: variable '__network_connections_result' from source: set_fact 30529 1726882681.29742: handler run complete 30529 1726882681.29764: attempt loop complete, returning result 30529 1726882681.29767: _execute() done 30529 1726882681.29770: dumping result to json 30529 1726882681.29772: done dumping result, returning 30529 1726882681.29782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-000000001d3e] 30529 1726882681.29795: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3e 30529 1726882681.29898: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3e 30529 1726882681.29901: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30529 1726882681.29996: no more pending results, returning what we have 30529 1726882681.30000: results queue empty 30529 1726882681.30002: checking for any_errors_fatal 30529 1726882681.30011: 
done checking for any_errors_fatal 30529 1726882681.30012: checking for max_fail_percentage 30529 1726882681.30013: done checking for max_fail_percentage 30529 1726882681.30015: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.30016: done checking to see if all hosts have failed 30529 1726882681.30017: getting the remaining hosts for this loop 30529 1726882681.30019: done getting the remaining hosts for this loop 30529 1726882681.30023: getting the next task for host managed_node1 30529 1726882681.30032: done getting next task for host managed_node1 30529 1726882681.30036: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882681.30041: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.30055: getting variables 30529 1726882681.30057: in VariableManager get_vars() 30529 1726882681.30208: Calling all_inventory to load vars for managed_node1 30529 1726882681.30212: Calling groups_inventory to load vars for managed_node1 30529 1726882681.30215: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.30234: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.30238: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.30241: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.31815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.33449: done with get_vars() 30529 1726882681.33484: done getting variables 30529 1726882681.33545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:01 -0400 (0:00:00.064) 0:01:35.361 ****** 30529 1726882681.33588: entering _queue_task() for managed_node1/debug 30529 1726882681.33987: worker is 1 (out of 1 available) 30529 1726882681.34116: exiting _queue_task() for managed_node1/debug 30529 1726882681.34128: done queuing things up, now waiting for results queue to drain 30529 1726882681.34130: waiting for pending results... 
30529 1726882681.34810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882681.34817: in run() - task 12673a56-9f93-b0f1-edc0-000000001d3f 30529 1726882681.34820: variable 'ansible_search_path' from source: unknown 30529 1726882681.34823: variable 'ansible_search_path' from source: unknown 30529 1726882681.34825: calling self._execute() 30529 1726882681.34828: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.34831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.34833: variable 'omit' from source: magic vars 30529 1726882681.35067: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.35077: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.35212: variable 'network_state' from source: role '' defaults 30529 1726882681.35229: Evaluated conditional (network_state != {}): False 30529 1726882681.35233: when evaluation is False, skipping this task 30529 1726882681.35236: _execute() done 30529 1726882681.35240: dumping result to json 30529 1726882681.35242: done dumping result, returning 30529 1726882681.35248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-000000001d3f] 30529 1726882681.35254: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3f 30529 1726882681.35355: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d3f 30529 1726882681.35358: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882681.35442: no more pending results, returning what we have 30529 1726882681.35446: results queue empty 30529 1726882681.35447: checking for any_errors_fatal 30529 1726882681.35457: done checking for any_errors_fatal 30529 1726882681.35458: checking for 
max_fail_percentage 30529 1726882681.35459: done checking for max_fail_percentage 30529 1726882681.35460: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.35461: done checking to see if all hosts have failed 30529 1726882681.35462: getting the remaining hosts for this loop 30529 1726882681.35464: done getting the remaining hosts for this loop 30529 1726882681.35468: getting the next task for host managed_node1 30529 1726882681.35478: done getting next task for host managed_node1 30529 1726882681.35482: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882681.35487: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.35519: getting variables 30529 1726882681.35522: in VariableManager get_vars() 30529 1726882681.35571: Calling all_inventory to load vars for managed_node1 30529 1726882681.35574: Calling groups_inventory to load vars for managed_node1 30529 1726882681.35577: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.35590: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.35796: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.35802: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.37421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.39069: done with get_vars() 30529 1726882681.39091: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:01 -0400 (0:00:00.056) 0:01:35.418 ****** 30529 1726882681.39202: entering _queue_task() for managed_node1/ping 30529 1726882681.39555: worker is 1 (out of 1 available) 30529 1726882681.39567: exiting _queue_task() for managed_node1/ping 30529 1726882681.39578: done queuing things up, now waiting for results queue to drain 30529 1726882681.39580: waiting for pending results... 
30529 1726882681.39884: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882681.40201: in run() - task 12673a56-9f93-b0f1-edc0-000000001d40 30529 1726882681.40206: variable 'ansible_search_path' from source: unknown 30529 1726882681.40209: variable 'ansible_search_path' from source: unknown 30529 1726882681.40212: calling self._execute() 30529 1726882681.40215: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.40217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.40219: variable 'omit' from source: magic vars 30529 1726882681.41028: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.41041: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.41047: variable 'omit' from source: magic vars 30529 1726882681.41120: variable 'omit' from source: magic vars 30529 1726882681.41154: variable 'omit' from source: magic vars 30529 1726882681.41196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882681.41409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882681.41431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882681.41459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.41472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882681.41729: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882681.41732: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.41735: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882681.41799: Set connection var ansible_shell_executable to /bin/sh 30529 1726882681.41805: Set connection var ansible_pipelining to False 30529 1726882681.41808: Set connection var ansible_shell_type to sh 30529 1726882681.41818: Set connection var ansible_timeout to 10 30529 1726882681.41821: Set connection var ansible_connection to ssh 30529 1726882681.41826: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882681.41849: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.41852: variable 'ansible_connection' from source: unknown 30529 1726882681.41855: variable 'ansible_module_compression' from source: unknown 30529 1726882681.41858: variable 'ansible_shell_type' from source: unknown 30529 1726882681.41860: variable 'ansible_shell_executable' from source: unknown 30529 1726882681.41862: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.41867: variable 'ansible_pipelining' from source: unknown 30529 1726882681.41989: variable 'ansible_timeout' from source: unknown 30529 1726882681.41994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.42432: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882681.42698: variable 'omit' from source: magic vars 30529 1726882681.42703: starting attempt loop 30529 1726882681.42706: running the handler 30529 1726882681.42708: _low_level_execute_command(): starting 30529 1726882681.42711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882681.43967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882681.44247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.44251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.44300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.46002: stdout chunk (state=3): >>>/root <<< 30529 1726882681.46112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.46122: stdout chunk (state=3): >>><<< 30529 1726882681.46133: stderr chunk (state=3): >>><<< 30529 1726882681.46163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882681.46181: _low_level_execute_command(): starting 30529 1726882681.46184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633 `" && echo ansible-tmp-1726882681.4616215-34989-44186499956633="` echo /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633 `" ) && sleep 0' 30529 1726882681.47605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882681.47610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882681.47613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882681.47616: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882681.47627: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882681.47877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.47885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.47928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.49940: stdout chunk (state=3): >>>ansible-tmp-1726882681.4616215-34989-44186499956633=/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633 <<< 30529 1726882681.49957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.50002: stderr chunk (state=3): >>><<< 30529 1726882681.50317: stdout chunk (state=3): >>><<< 30529 1726882681.50341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882681.4616215-34989-44186499956633=/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882681.50390: variable 'ansible_module_compression' from source: unknown 30529 1726882681.50518: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882681.50668: variable 'ansible_facts' from source: unknown 30529 1726882681.50935: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py 30529 1726882681.51409: Sending initial data 30529 1726882681.51412: Sent initial data (152 bytes) 30529 1726882681.52322: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882681.52340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882681.52426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882681.52429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882681.52432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882681.52435: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882681.52513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882681.52534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882681.52537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.52657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.52729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.54352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882681.54475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882681.54514: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj3a7z6np /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py <<< 30529 1726882681.54517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py" <<< 30529 1726882681.54572: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj3a7z6np" to remote "/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py" <<< 30529 1726882681.55535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.55596: stderr chunk (state=3): >>><<< 30529 1726882681.55800: stdout chunk (state=3): >>><<< 30529 1726882681.55803: done transferring module to remote 30529 1726882681.55805: _low_level_execute_command(): starting 30529 1726882681.55808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/ /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py && sleep 0' 30529 1726882681.57112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882681.57329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882681.57353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.57368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.57514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.59484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.59505: stdout chunk (state=3): >>><<< 30529 1726882681.59700: stderr chunk (state=3): >>><<< 30529 1726882681.59704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882681.59707: _low_level_execute_command(): starting 30529 1726882681.59709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/AnsiballZ_ping.py && sleep 0' 30529 1726882681.60802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882681.60935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882681.60991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882681.61041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.61057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.61149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.76023: 
stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882681.77401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882681.77698: stderr chunk (state=3): >>><<< 30529 1726882681.77702: stdout chunk (state=3): >>><<< 30529 1726882681.77705: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882681.77709: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882681.77712: _low_level_execute_command(): starting 30529 1726882681.77714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882681.4616215-34989-44186499956633/ > /dev/null 2>&1 && sleep 0' 30529 1726882681.78864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882681.78881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882681.78905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882681.78923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882681.79021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882681.79373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882681.79432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882681.81299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882681.81321: stdout chunk (state=3): >>><<< 30529 1726882681.81599: stderr chunk (state=3): >>><<< 30529 1726882681.81603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882681.81610: handler run complete 30529 
1726882681.81612: attempt loop complete, returning result 30529 1726882681.81613: _execute() done 30529 1726882681.81616: dumping result to json 30529 1726882681.81618: done dumping result, returning 30529 1726882681.81620: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-000000001d40] 30529 1726882681.81622: sending task result for task 12673a56-9f93-b0f1-edc0-000000001d40 30529 1726882681.81691: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001d40 30529 1726882681.81697: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882681.81775: no more pending results, returning what we have 30529 1726882681.81780: results queue empty 30529 1726882681.81781: checking for any_errors_fatal 30529 1726882681.81791: done checking for any_errors_fatal 30529 1726882681.81792: checking for max_fail_percentage 30529 1726882681.81795: done checking for max_fail_percentage 30529 1726882681.81796: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.81798: done checking to see if all hosts have failed 30529 1726882681.81798: getting the remaining hosts for this loop 30529 1726882681.81800: done getting the remaining hosts for this loop 30529 1726882681.81805: getting the next task for host managed_node1 30529 1726882681.81819: done getting next task for host managed_node1 30529 1726882681.81821: ^ task is: TASK: meta (role_complete) 30529 1726882681.81827: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882681.81843: getting variables 30529 1726882681.81845: in VariableManager get_vars() 30529 1726882681.82106: Calling all_inventory to load vars for managed_node1 30529 1726882681.82109: Calling groups_inventory to load vars for managed_node1 30529 1726882681.82112: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.82125: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.82128: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.82131: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.84022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.85820: done with get_vars() 30529 1726882681.85843: done getting variables 30529 1726882681.85935: done queuing things up, now waiting for results queue to drain 30529 1726882681.85937: results queue empty 30529 1726882681.85938: checking for any_errors_fatal 30529 1726882681.85940: done checking for any_errors_fatal 30529 1726882681.85941: checking for max_fail_percentage 30529 1726882681.85942: done checking for max_fail_percentage 30529 1726882681.85942: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882681.85943: done checking to see if all hosts have failed 30529 1726882681.85944: getting the remaining hosts for this loop 30529 1726882681.85945: done getting the remaining hosts for this loop 30529 1726882681.85947: getting the next task for host managed_node1 30529 1726882681.85952: done getting next task for host managed_node1 30529 1726882681.85954: ^ task is: TASK: Asserts 30529 1726882681.85956: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.85959: getting variables 30529 1726882681.85960: in VariableManager get_vars() 30529 1726882681.85976: Calling all_inventory to load vars for managed_node1 30529 1726882681.85978: Calling groups_inventory to load vars for managed_node1 30529 1726882681.85980: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.85985: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.85987: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.85992: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.87187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.88782: done with get_vars() 30529 1726882681.88919: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:38:01 -0400 (0:00:00.497) 0:01:35.916 ****** 30529 1726882681.89106: entering _queue_task() for managed_node1/include_tasks 30529 1726882681.89696: worker is 1 (out of 1 available) 30529 1726882681.89707: exiting _queue_task() for managed_node1/include_tasks 30529 1726882681.89717: done queuing things up, now waiting for results queue to drain 30529 1726882681.89718: waiting for pending results... 
30529 1726882681.90010: running TaskExecutor() for managed_node1/TASK: Asserts 30529 1726882681.90047: in run() - task 12673a56-9f93-b0f1-edc0-000000001749 30529 1726882681.90071: variable 'ansible_search_path' from source: unknown 30529 1726882681.90080: variable 'ansible_search_path' from source: unknown 30529 1726882681.90135: variable 'lsr_assert' from source: include params 30529 1726882681.90381: variable 'lsr_assert' from source: include params 30529 1726882681.90435: variable 'omit' from source: magic vars 30529 1726882681.90573: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882681.90586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882681.90799: variable 'omit' from source: magic vars 30529 1726882681.90844: variable 'ansible_distribution_major_version' from source: facts 30529 1726882681.90860: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882681.90873: variable 'item' from source: unknown 30529 1726882681.90947: variable 'item' from source: unknown 30529 1726882681.90984: variable 'item' from source: unknown 30529 1726882681.91053: variable 'item' from source: unknown 30529 1726882681.91299: dumping result to json 30529 1726882681.91302: done dumping result, returning 30529 1726882681.91305: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-000000001749] 30529 1726882681.91308: sending task result for task 12673a56-9f93-b0f1-edc0-000000001749 30529 1726882681.91350: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001749 30529 1726882681.91353: WORKER PROCESS EXITING 30529 1726882681.91376: no more pending results, returning what we have 30529 1726882681.91381: in VariableManager get_vars() 30529 1726882681.91432: Calling all_inventory to load vars for managed_node1 30529 1726882681.91435: Calling groups_inventory to load vars for managed_node1 30529 1726882681.91439: Calling all_plugins_inventory 
to load vars for managed_node1 30529 1726882681.91452: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.91455: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.91457: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.93350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882681.94870: done with get_vars() 30529 1726882681.94896: variable 'ansible_search_path' from source: unknown 30529 1726882681.94897: variable 'ansible_search_path' from source: unknown 30529 1726882681.94937: we have included files to process 30529 1726882681.94939: generating all_blocks data 30529 1726882681.94941: done generating all_blocks data 30529 1726882681.94947: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882681.94948: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882681.94951: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882681.95053: in VariableManager get_vars() 30529 1726882681.95074: done with get_vars() 30529 1726882681.95185: done processing included file 30529 1726882681.95190: iterating over new_blocks loaded from include file 30529 1726882681.95192: in VariableManager get_vars() 30529 1726882681.95210: done with get_vars() 30529 1726882681.95211: filtering new block on tags 30529 1726882681.95247: done filtering new block on tags 30529 1726882681.95250: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 => (item=tasks/assert_profile_absent.yml) 30529 
1726882681.95254: extending task lists for all hosts with included blocks 30529 1726882681.96281: done extending task lists 30529 1726882681.96282: done processing included files 30529 1726882681.96283: results queue empty 30529 1726882681.96284: checking for any_errors_fatal 30529 1726882681.96285: done checking for any_errors_fatal 30529 1726882681.96286: checking for max_fail_percentage 30529 1726882681.96290: done checking for max_fail_percentage 30529 1726882681.96290: checking to see if all hosts have failed and the running result is not ok 30529 1726882681.96291: done checking to see if all hosts have failed 30529 1726882681.96292: getting the remaining hosts for this loop 30529 1726882681.96295: done getting the remaining hosts for this loop 30529 1726882681.96298: getting the next task for host managed_node1 30529 1726882681.96302: done getting next task for host managed_node1 30529 1726882681.96304: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30529 1726882681.96307: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882681.96310: getting variables 30529 1726882681.96311: in VariableManager get_vars() 30529 1726882681.96322: Calling all_inventory to load vars for managed_node1 30529 1726882681.96324: Calling groups_inventory to load vars for managed_node1 30529 1726882681.96326: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882681.96332: Calling all_plugins_play to load vars for managed_node1 30529 1726882681.96334: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882681.96337: Calling groups_plugins_play to load vars for managed_node1 30529 1726882681.98540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.00562: done with get_vars() 30529 1726882682.00591: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:38:02 -0400 (0:00:00.115) 0:01:36.032 ****** 30529 1726882682.00674: entering _queue_task() for managed_node1/include_tasks 30529 1726882682.01241: worker is 1 (out of 1 available) 30529 1726882682.01253: exiting _queue_task() for managed_node1/include_tasks 30529 1726882682.01266: done queuing things up, now waiting for results queue to drain 30529 1726882682.01267: waiting for pending results... 
30529 1726882682.01912: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 30529 1726882682.01917: in run() - task 12673a56-9f93-b0f1-edc0-000000001e99 30529 1726882682.02118: variable 'ansible_search_path' from source: unknown 30529 1726882682.02121: variable 'ansible_search_path' from source: unknown 30529 1726882682.02125: calling self._execute() 30529 1726882682.02267: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.02279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.02300: variable 'omit' from source: magic vars 30529 1726882682.03127: variable 'ansible_distribution_major_version' from source: facts 30529 1726882682.03146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882682.03317: _execute() done 30529 1726882682.03320: dumping result to json 30529 1726882682.03323: done dumping result, returning 30529 1726882682.03325: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-000000001e99] 30529 1726882682.03328: sending task result for task 12673a56-9f93-b0f1-edc0-000000001e99 30529 1726882682.03409: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001e99 30529 1726882682.03413: WORKER PROCESS EXITING 30529 1726882682.03448: no more pending results, returning what we have 30529 1726882682.03454: in VariableManager get_vars() 30529 1726882682.03509: Calling all_inventory to load vars for managed_node1 30529 1726882682.03512: Calling groups_inventory to load vars for managed_node1 30529 1726882682.03516: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882682.03530: Calling all_plugins_play to load vars for managed_node1 30529 1726882682.03534: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882682.03538: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882682.05723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.08516: done with get_vars() 30529 1726882682.08540: variable 'ansible_search_path' from source: unknown 30529 1726882682.08542: variable 'ansible_search_path' from source: unknown 30529 1726882682.08552: variable 'item' from source: include params 30529 1726882682.08673: variable 'item' from source: include params 30529 1726882682.08872: we have included files to process 30529 1726882682.08874: generating all_blocks data 30529 1726882682.08876: done generating all_blocks data 30529 1726882682.08877: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882682.08878: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882682.08880: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882682.09808: done processing included file 30529 1726882682.09810: iterating over new_blocks loaded from include file 30529 1726882682.09811: in VariableManager get_vars() 30529 1726882682.09831: done with get_vars() 30529 1726882682.09833: filtering new block on tags 30529 1726882682.09905: done filtering new block on tags 30529 1726882682.09908: in VariableManager get_vars() 30529 1726882682.09925: done with get_vars() 30529 1726882682.09927: filtering new block on tags 30529 1726882682.09980: done filtering new block on tags 30529 1726882682.09983: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 30529 1726882682.09991: extending task lists for all hosts with included blocks 30529 1726882682.10217: done 
extending task lists 30529 1726882682.10218: done processing included files 30529 1726882682.10219: results queue empty 30529 1726882682.10220: checking for any_errors_fatal 30529 1726882682.10223: done checking for any_errors_fatal 30529 1726882682.10224: checking for max_fail_percentage 30529 1726882682.10225: done checking for max_fail_percentage 30529 1726882682.10226: checking to see if all hosts have failed and the running result is not ok 30529 1726882682.10226: done checking to see if all hosts have failed 30529 1726882682.10227: getting the remaining hosts for this loop 30529 1726882682.10228: done getting the remaining hosts for this loop 30529 1726882682.10231: getting the next task for host managed_node1 30529 1726882682.10235: done getting next task for host managed_node1 30529 1726882682.10236: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882682.10240: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882682.10241: getting variables 30529 1726882682.10242: in VariableManager get_vars() 30529 1726882682.10251: Calling all_inventory to load vars for managed_node1 30529 1726882682.10253: Calling groups_inventory to load vars for managed_node1 30529 1726882682.10255: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882682.10261: Calling all_plugins_play to load vars for managed_node1 30529 1726882682.10263: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882682.10266: Calling groups_plugins_play to load vars for managed_node1 30529 1726882682.12325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.13926: done with get_vars() 30529 1726882682.13956: done getting variables 30529 1726882682.14005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:38:02 -0400 (0:00:00.133) 0:01:36.166 ****** 30529 1726882682.14037: entering _queue_task() for managed_node1/set_fact 30529 1726882682.14557: worker is 1 (out of 1 available) 30529 1726882682.14569: exiting _queue_task() for managed_node1/set_fact 30529 1726882682.14583: done queuing things up, now waiting for results queue to drain 30529 1726882682.14585: waiting for pending results... 
30529 1726882682.15200: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882682.15302: in run() - task 12673a56-9f93-b0f1-edc0-000000001f17 30529 1726882682.15318: variable 'ansible_search_path' from source: unknown 30529 1726882682.15326: variable 'ansible_search_path' from source: unknown 30529 1726882682.15436: calling self._execute() 30529 1726882682.15472: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.15478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.15488: variable 'omit' from source: magic vars 30529 1726882682.15897: variable 'ansible_distribution_major_version' from source: facts 30529 1726882682.15908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882682.15915: variable 'omit' from source: magic vars 30529 1726882682.15973: variable 'omit' from source: magic vars 30529 1726882682.16013: variable 'omit' from source: magic vars 30529 1726882682.16080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882682.16102: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882682.16500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882682.16505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.16507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.16510: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882682.16512: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.16514: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882682.16516: Set connection var ansible_shell_executable to /bin/sh 30529 1726882682.16518: Set connection var ansible_pipelining to False 30529 1726882682.16520: Set connection var ansible_shell_type to sh 30529 1726882682.16522: Set connection var ansible_timeout to 10 30529 1726882682.16524: Set connection var ansible_connection to ssh 30529 1726882682.16526: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882682.16528: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.16530: variable 'ansible_connection' from source: unknown 30529 1726882682.16532: variable 'ansible_module_compression' from source: unknown 30529 1726882682.16534: variable 'ansible_shell_type' from source: unknown 30529 1726882682.16536: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.16538: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.16541: variable 'ansible_pipelining' from source: unknown 30529 1726882682.16543: variable 'ansible_timeout' from source: unknown 30529 1726882682.16545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.16548: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882682.16550: variable 'omit' from source: magic vars 30529 1726882682.16552: starting attempt loop 30529 1726882682.16554: running the handler 30529 1726882682.16556: handler run complete 30529 1726882682.16558: attempt loop complete, returning result 30529 1726882682.16560: _execute() done 30529 1726882682.16562: dumping result to json 30529 1726882682.16564: done dumping result, returning 30529 1726882682.16566: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-000000001f17] 30529 1726882682.16568: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f17 30529 1726882682.16664: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f17 30529 1726882682.16667: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30529 1726882682.16758: no more pending results, returning what we have 30529 1726882682.16761: results queue empty 30529 1726882682.16762: checking for any_errors_fatal 30529 1726882682.16764: done checking for any_errors_fatal 30529 1726882682.16765: checking for max_fail_percentage 30529 1726882682.16766: done checking for max_fail_percentage 30529 1726882682.16767: checking to see if all hosts have failed and the running result is not ok 30529 1726882682.16769: done checking to see if all hosts have failed 30529 1726882682.16769: getting the remaining hosts for this loop 30529 1726882682.16771: done getting the remaining hosts for this loop 30529 1726882682.16775: getting the next task for host managed_node1 30529 1726882682.16785: done getting next task for host managed_node1 30529 1726882682.16791: ^ task is: TASK: Stat profile file 30529 1726882682.16798: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882682.16803: getting variables 30529 1726882682.16806: in VariableManager get_vars() 30529 1726882682.16850: Calling all_inventory to load vars for managed_node1 30529 1726882682.16853: Calling groups_inventory to load vars for managed_node1 30529 1726882682.16857: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882682.16868: Calling all_plugins_play to load vars for managed_node1 30529 1726882682.16872: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882682.16875: Calling groups_plugins_play to load vars for managed_node1 30529 1726882682.18753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.20606: done with get_vars() 30529 1726882682.20630: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:38:02 -0400 (0:00:00.066) 0:01:36.233 ****** 30529 1726882682.20729: entering _queue_task() for managed_node1/stat 30529 1726882682.21077: worker is 1 (out of 1 available) 30529 1726882682.21091: exiting _queue_task() for managed_node1/stat 30529 1726882682.21109: done queuing things up, now waiting for results queue to drain 30529 1726882682.21111: 
waiting for pending results... 30529 1726882682.21590: running TaskExecutor() for managed_node1/TASK: Stat profile file 30529 1726882682.22005: in run() - task 12673a56-9f93-b0f1-edc0-000000001f18 30529 1726882682.22010: variable 'ansible_search_path' from source: unknown 30529 1726882682.22013: variable 'ansible_search_path' from source: unknown 30529 1726882682.22017: calling self._execute() 30529 1726882682.22108: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.22298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.22302: variable 'omit' from source: magic vars 30529 1726882682.23044: variable 'ansible_distribution_major_version' from source: facts 30529 1726882682.23066: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882682.23079: variable 'omit' from source: magic vars 30529 1726882682.23148: variable 'omit' from source: magic vars 30529 1726882682.23319: variable 'profile' from source: play vars 30529 1726882682.23329: variable 'interface' from source: play vars 30529 1726882682.23397: variable 'interface' from source: play vars 30529 1726882682.23443: variable 'omit' from source: magic vars 30529 1726882682.23510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882682.23556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882682.24004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882682.24008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.24010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.24013: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882682.24016: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.24018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.24020: Set connection var ansible_shell_executable to /bin/sh 30529 1726882682.24022: Set connection var ansible_pipelining to False 30529 1726882682.24025: Set connection var ansible_shell_type to sh 30529 1726882682.24043: Set connection var ansible_timeout to 10 30529 1726882682.24050: Set connection var ansible_connection to ssh 30529 1726882682.24061: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882682.24086: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.24329: variable 'ansible_connection' from source: unknown 30529 1726882682.24333: variable 'ansible_module_compression' from source: unknown 30529 1726882682.24335: variable 'ansible_shell_type' from source: unknown 30529 1726882682.24337: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.24339: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.24341: variable 'ansible_pipelining' from source: unknown 30529 1726882682.24344: variable 'ansible_timeout' from source: unknown 30529 1726882682.24346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.24541: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882682.24673: variable 'omit' from source: magic vars 30529 1726882682.24770: starting attempt loop 30529 1726882682.24774: running the handler 30529 1726882682.24776: _low_level_execute_command(): starting 30529 1726882682.24780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 
1726882682.25791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.25809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882682.25823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882682.25870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882682.25890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882682.25974: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.26000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.26020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.26099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.27914: stdout chunk (state=3): >>>/root <<< 30529 1726882682.27918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.27920: stdout chunk (state=3): >>><<< 30529 1726882682.27922: stderr chunk (state=3): >>><<< 30529 1726882682.28057: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.28060: _low_level_execute_command(): starting 30529 1726882682.28064: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103 `" && echo ansible-tmp-1726882682.2796764-35040-16028956012103="` echo /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103 `" ) && sleep 0' 30529 1726882682.28628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.28643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882682.28657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882682.28683: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882682.28706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882682.28809: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.28832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.28907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.30805: stdout chunk (state=3): >>>ansible-tmp-1726882682.2796764-35040-16028956012103=/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103 <<< 30529 1726882682.30970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.30973: stdout chunk (state=3): >>><<< 30529 1726882682.30976: stderr chunk (state=3): >>><<< 30529 1726882682.31200: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882682.2796764-35040-16028956012103=/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.31203: variable 'ansible_module_compression' from source: unknown 30529 1726882682.31205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882682.31208: variable 'ansible_facts' from source: unknown 30529 1726882682.31266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py 30529 1726882682.31472: Sending initial data 30529 1726882682.31476: Sent initial data (152 bytes) 30529 1726882682.32112: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.32119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.32132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.32272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.32352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.33912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882682.33963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882682.34007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmphvzdh0t9 /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py <<< 30529 1726882682.34021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py" <<< 30529 1726882682.34058: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmphvzdh0t9" to remote "/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py" <<< 30529 1726882682.34830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.34978: stderr chunk (state=3): >>><<< 30529 1726882682.34981: stdout chunk (state=3): >>><<< 30529 1726882682.34984: done transferring module to remote 30529 1726882682.34986: _low_level_execute_command(): starting 30529 1726882682.34988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/ /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py && sleep 0' 30529 1726882682.35719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.35831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.35874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.35917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.37697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.37718: stdout chunk (state=3): >>><<< 30529 1726882682.37723: stderr chunk (state=3): >>><<< 30529 1726882682.37818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.37821: _low_level_execute_command(): starting 30529 1726882682.37824: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/AnsiballZ_stat.py && sleep 0' 30529 1726882682.38452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.38456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.38515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.38534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.38566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.38646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.53642: stdout chunk (state=3): >>> {"changed": false, 
"stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882682.54924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.55200: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 30529 1726882682.55209: stdout chunk (state=3): >>><<< 30529 1726882682.55229: stderr chunk (state=3): >>><<< 30529 1726882682.55302: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882682.55305: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882682.55317: _low_level_execute_command(): starting 30529 1726882682.55328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882682.2796764-35040-16028956012103/ > /dev/null 2>&1 && sleep 0' 30529 1726882682.56014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.56071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.56145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.56174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.56343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.56826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.58607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.58617: stdout chunk (state=3): >>><<< 30529 1726882682.58628: stderr chunk (state=3): >>><<< 30529 1726882682.58799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.58802: handler run complete 30529 1726882682.58805: attempt loop complete, returning result 30529 1726882682.58807: _execute() done 30529 1726882682.58810: dumping result to json 30529 1726882682.58812: done dumping result, returning 30529 1726882682.58814: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-000000001f18] 30529 1726882682.58817: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f18 30529 1726882682.58896: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f18 30529 1726882682.58901: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882682.58961: no more pending results, returning what we have 30529 1726882682.58964: results queue empty 30529 1726882682.58965: checking for any_errors_fatal 30529 1726882682.58976: done checking for any_errors_fatal 30529 1726882682.58977: checking for max_fail_percentage 30529 1726882682.58978: done checking for max_fail_percentage 30529 1726882682.58979: checking to see if all hosts have failed and the running result is not ok 30529 1726882682.58980: done checking to see if all hosts have failed 30529 1726882682.58981: getting the remaining hosts for this loop 30529 1726882682.58982: done getting the remaining hosts for this loop 30529 1726882682.58986: getting the next task for host managed_node1 30529 1726882682.58997: done getting next task for host managed_node1 30529 1726882682.58999: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882682.59005: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882682.59011: getting variables 30529 1726882682.59013: in VariableManager get_vars() 30529 1726882682.59053: Calling all_inventory to load vars for managed_node1 30529 1726882682.59055: Calling groups_inventory to load vars for managed_node1 30529 1726882682.59059: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882682.59218: Calling all_plugins_play to load vars for managed_node1 30529 1726882682.59222: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882682.59226: Calling groups_plugins_play to load vars for managed_node1 30529 1726882682.61818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.63670: done with get_vars() 30529 1726882682.63702: done getting variables 30529 1726882682.63775: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:38:02 -0400 (0:00:00.430) 0:01:36.664 ****** 30529 1726882682.63816: entering _queue_task() for managed_node1/set_fact 30529 1726882682.64231: worker is 1 (out of 1 available) 30529 1726882682.64245: exiting _queue_task() for managed_node1/set_fact 30529 1726882682.64257: done queuing things up, now waiting for results queue to drain 30529 1726882682.64258: waiting for pending results... 30529 1726882682.64591: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882682.64730: in run() - task 12673a56-9f93-b0f1-edc0-000000001f19 30529 1726882682.64749: variable 'ansible_search_path' from source: unknown 30529 1726882682.64753: variable 'ansible_search_path' from source: unknown 30529 1726882682.65100: calling self._execute() 30529 1726882682.65104: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.65107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.65110: variable 'omit' from source: magic vars 30529 1726882682.65422: variable 'ansible_distribution_major_version' from source: facts 30529 1726882682.65433: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882682.65677: variable 'profile_stat' from source: set_fact 30529 1726882682.65688: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882682.65700: when evaluation is False, skipping this task 30529 1726882682.65703: _execute() done 30529 1726882682.65707: dumping result to json 30529 1726882682.65833: done dumping 
result, returning 30529 1726882682.65839: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-000000001f19] 30529 1726882682.65845: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f19 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882682.66000: no more pending results, returning what we have 30529 1726882682.66004: results queue empty 30529 1726882682.66005: checking for any_errors_fatal 30529 1726882682.66019: done checking for any_errors_fatal 30529 1726882682.66020: checking for max_fail_percentage 30529 1726882682.66022: done checking for max_fail_percentage 30529 1726882682.66023: checking to see if all hosts have failed and the running result is not ok 30529 1726882682.66024: done checking to see if all hosts have failed 30529 1726882682.66024: getting the remaining hosts for this loop 30529 1726882682.66026: done getting the remaining hosts for this loop 30529 1726882682.66031: getting the next task for host managed_node1 30529 1726882682.66101: done getting next task for host managed_node1 30529 1726882682.66104: ^ task is: TASK: Get NM profile info 30529 1726882682.66109: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882682.66114: getting variables 30529 1726882682.66116: in VariableManager get_vars() 30529 1726882682.66267: Calling all_inventory to load vars for managed_node1 30529 1726882682.66270: Calling groups_inventory to load vars for managed_node1 30529 1726882682.66273: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882682.66284: Calling all_plugins_play to load vars for managed_node1 30529 1726882682.66286: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882682.66305: Calling groups_plugins_play to load vars for managed_node1 30529 1726882682.66824: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f19 30529 1726882682.66827: WORKER PROCESS EXITING 30529 1726882682.67904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882682.69648: done with get_vars() 30529 1726882682.69676: done getting variables 30529 1726882682.69741: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:38:02 -0400 (0:00:00.059) 0:01:36.723 ****** 30529 1726882682.69779: entering _queue_task() for managed_node1/shell 30529 1726882682.70324: worker is 1 (out of 1 available) 30529 1726882682.70335: exiting _queue_task() for managed_node1/shell 30529 1726882682.70345: done queuing things up, now waiting for results queue to drain 30529 1726882682.70346: waiting for pending results... 30529 1726882682.70638: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882682.70643: in run() - task 12673a56-9f93-b0f1-edc0-000000001f1a 30529 1726882682.70646: variable 'ansible_search_path' from source: unknown 30529 1726882682.70649: variable 'ansible_search_path' from source: unknown 30529 1726882682.70675: calling self._execute() 30529 1726882682.70779: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.70783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.70809: variable 'omit' from source: magic vars 30529 1726882682.71227: variable 'ansible_distribution_major_version' from source: facts 30529 1726882682.71248: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882682.71253: variable 'omit' from source: magic vars 30529 1726882682.71311: variable 'omit' from source: magic vars 30529 1726882682.71418: variable 'profile' from source: play vars 30529 1726882682.71421: variable 'interface' from source: play vars 30529 1726882682.71497: variable 'interface' from source: play vars 30529 1726882682.71613: variable 'omit' from source: magic vars 30529 1726882682.71618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882682.71621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 30529 1726882682.71623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882682.71639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.71652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882682.71696: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882682.71699: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.71702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.71820: Set connection var ansible_shell_executable to /bin/sh 30529 1726882682.71828: Set connection var ansible_pipelining to False 30529 1726882682.71830: Set connection var ansible_shell_type to sh 30529 1726882682.71838: Set connection var ansible_timeout to 10 30529 1726882682.71840: Set connection var ansible_connection to ssh 30529 1726882682.71845: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882682.71866: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.71869: variable 'ansible_connection' from source: unknown 30529 1726882682.71871: variable 'ansible_module_compression' from source: unknown 30529 1726882682.71873: variable 'ansible_shell_type' from source: unknown 30529 1726882682.71948: variable 'ansible_shell_executable' from source: unknown 30529 1726882682.72303: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882682.72307: variable 'ansible_pipelining' from source: unknown 30529 1726882682.72310: variable 'ansible_timeout' from source: unknown 30529 1726882682.72312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882682.72315: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882682.72318: variable 'omit' from source: magic vars 30529 1726882682.72321: starting attempt loop 30529 1726882682.72323: running the handler 30529 1726882682.72326: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882682.72329: _low_level_execute_command(): starting 30529 1726882682.72332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882682.72989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.73037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
<<< 30529 1726882682.73048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.73066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.73206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.74856: stdout chunk (state=3): >>>/root <<< 30529 1726882682.75018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.75021: stdout chunk (state=3): >>><<< 30529 1726882682.75024: stderr chunk (state=3): >>><<< 30529 1726882682.75057: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.75171: _low_level_execute_command(): starting 30529 1726882682.75175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561 `" && echo ansible-tmp-1726882682.7506466-35086-86955145308561="` echo /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561 `" ) && sleep 0' 30529 1726882682.75740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.75755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882682.75779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882682.75809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882682.75862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882682.76028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.76043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.76111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.76151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.76191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.78073: stdout chunk (state=3): 
>>>ansible-tmp-1726882682.7506466-35086-86955145308561=/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561 <<< 30529 1726882682.78230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.78233: stdout chunk (state=3): >>><<< 30529 1726882682.78236: stderr chunk (state=3): >>><<< 30529 1726882682.78253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882682.7506466-35086-86955145308561=/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.78298: variable 'ansible_module_compression' from source: unknown 30529 1726882682.78398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882682.78417: variable 'ansible_facts' 
from source: unknown 30529 1726882682.78511: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py 30529 1726882682.78727: Sending initial data 30529 1726882682.78730: Sent initial data (155 bytes) 30529 1726882682.79386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.79450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.79468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.79507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.79642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.81201: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30529 1726882682.81216: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882682.81394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882682.81439: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpav90x8yb /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py <<< 30529 1726882682.81447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py" <<< 30529 1726882682.81485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpav90x8yb" to remote "/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py" <<< 30529 1726882682.82297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.82399: stderr chunk (state=3): >>><<< 30529 1726882682.82402: stdout chunk (state=3): >>><<< 30529 1726882682.82405: done transferring module to remote 30529 1726882682.82407: _low_level_execute_command(): starting 30529 1726882682.82409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/ 
/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py && sleep 0' 30529 1726882682.83050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.83079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882682.83108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882682.83209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.83241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.83418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882682.85084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882682.85100: stdout chunk (state=3): >>><<< 30529 1726882682.85112: stderr chunk (state=3): >>><<< 30529 1726882682.85133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882682.85142: _low_level_execute_command(): starting 30529 1726882682.85151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/AnsiballZ_command.py && sleep 0' 30529 1726882682.85722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882682.85736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882682.85752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882682.85769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882682.85786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882682.85804: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882682.85820: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.85839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882682.85912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882682.85941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882682.85957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882682.86068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882682.86244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.02942: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:38:03.012456", "end": "2024-09-20 21:38:03.028376", "delta": "0:00:00.015920", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882683.04416: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882683.04442: stderr chunk (state=3): >>><<< 30529 1726882683.04445: stdout chunk (state=3): >>><<< 30529 1726882683.04463: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:38:03.012456", "end": "2024-09-20 21:38:03.028376", "delta": "0:00:00.015920", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 
closed. 30529 1726882683.04496: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882683.04503: _low_level_execute_command(): starting 30529 1726882683.04508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882682.7506466-35086-86955145308561/ > /dev/null 2>&1 && sleep 0' 30529 1726882683.04949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882683.04953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882683.04959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882683.04961: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882683.04963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882683.05012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882683.05016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.05063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.06871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.06899: stderr chunk (state=3): >>><<< 30529 1726882683.06904: stdout chunk (state=3): >>><<< 30529 1726882683.06919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882683.06922: handler run complete 30529 1726882683.06939: Evaluated conditional (False): False 30529 1726882683.06948: attempt loop complete, returning result 30529 1726882683.06951: _execute() done 30529 1726882683.06953: dumping result to json 30529 1726882683.06958: done dumping result, returning 30529 1726882683.06965: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-000000001f1a] 30529 1726882683.06969: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1a 30529 1726882683.07067: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1a 30529 1726882683.07069: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.015920", "end": "2024-09-20 21:38:03.028376", "rc": 1, "start": "2024-09-20 21:38:03.012456" } MSG: non-zero return code ...ignoring 30529 1726882683.07135: no more pending results, returning what we have 30529 1726882683.07138: results queue empty 30529 1726882683.07139: checking for any_errors_fatal 30529 1726882683.07146: done checking for any_errors_fatal 30529 1726882683.07147: checking for max_fail_percentage 30529 1726882683.07148: done checking for max_fail_percentage 30529 1726882683.07149: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.07150: done checking to see if all hosts have failed 30529 1726882683.07151: getting the remaining hosts for this loop 30529 1726882683.07152: done getting the remaining hosts for this loop 30529 1726882683.07156: getting the next task for host managed_node1 30529 1726882683.07164: done getting next task for host managed_node1 30529 1726882683.07167: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30529 1726882683.07173: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.07177: getting variables 30529 1726882683.07179: in VariableManager get_vars() 30529 1726882683.07225: Calling all_inventory to load vars for managed_node1 30529 1726882683.07228: Calling groups_inventory to load vars for managed_node1 30529 1726882683.07231: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.07242: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.07245: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.07247: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.08225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.09100: done with get_vars() 30529 1726882683.09116: done getting variables 30529 1726882683.09160: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:38:03 -0400 (0:00:00.394) 0:01:37.117 ****** 30529 1726882683.09183: entering _queue_task() for managed_node1/set_fact 30529 1726882683.09422: worker is 1 (out of 1 available) 30529 1726882683.09438: exiting _queue_task() for managed_node1/set_fact 30529 1726882683.09451: done queuing things up, now waiting for results queue to drain 30529 1726882683.09453: waiting for pending results... 
30529 1726882683.09778: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882683.09784: in run() - task 12673a56-9f93-b0f1-edc0-000000001f1b 30529 1726882683.09788: variable 'ansible_search_path' from source: unknown 30529 1726882683.09792: variable 'ansible_search_path' from source: unknown 30529 1726882683.09835: calling self._execute() 30529 1726882683.09922: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.10040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.10051: variable 'omit' from source: magic vars 30529 1726882683.10755: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.10766: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.11103: variable 'nm_profile_exists' from source: set_fact 30529 1726882683.11113: Evaluated conditional (nm_profile_exists.rc == 0): False 30529 1726882683.11116: when evaluation is False, skipping this task 30529 1726882683.11119: _execute() done 30529 1726882683.11121: dumping result to json 30529 1726882683.11124: done dumping result, returning 30529 1726882683.11171: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-000000001f1b] 30529 1726882683.11174: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1b 30529 1726882683.11237: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1b 30529 1726882683.11240: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30529 1726882683.11319: no more pending results, returning what we have 30529 1726882683.11322: results queue empty 30529 1726882683.11323: checking for any_errors_fatal 30529 
1726882683.11331: done checking for any_errors_fatal 30529 1726882683.11332: checking for max_fail_percentage 30529 1726882683.11333: done checking for max_fail_percentage 30529 1726882683.11334: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.11335: done checking to see if all hosts have failed 30529 1726882683.11336: getting the remaining hosts for this loop 30529 1726882683.11337: done getting the remaining hosts for this loop 30529 1726882683.11341: getting the next task for host managed_node1 30529 1726882683.11352: done getting next task for host managed_node1 30529 1726882683.11355: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882683.11361: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.11365: getting variables 30529 1726882683.11367: in VariableManager get_vars() 30529 1726882683.11410: Calling all_inventory to load vars for managed_node1 30529 1726882683.11412: Calling groups_inventory to load vars for managed_node1 30529 1726882683.11416: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.11428: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.11431: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.11434: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.12981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.13941: done with get_vars() 30529 1726882683.13956: done getting variables 30529 1726882683.14000: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882683.14081: variable 'profile' from source: play vars 30529 1726882683.14084: variable 'interface' from source: play vars 30529 1726882683.14125: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:38:03 -0400 (0:00:00.049) 0:01:37.167 ****** 30529 1726882683.14150: entering _queue_task() for managed_node1/command 30529 1726882683.14385: worker is 1 (out of 1 available) 30529 1726882683.14398: exiting _queue_task() for managed_node1/command 30529 1726882683.14411: done queuing things up, now waiting for results queue to drain 30529 1726882683.14413: waiting for pending results... 
30529 1726882683.14595: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882683.14675: in run() - task 12673a56-9f93-b0f1-edc0-000000001f1d 30529 1726882683.14687: variable 'ansible_search_path' from source: unknown 30529 1726882683.14691: variable 'ansible_search_path' from source: unknown 30529 1726882683.14721: calling self._execute() 30529 1726882683.14796: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.14800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.14809: variable 'omit' from source: magic vars 30529 1726882683.15066: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.15076: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.15161: variable 'profile_stat' from source: set_fact 30529 1726882683.15168: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882683.15171: when evaluation is False, skipping this task 30529 1726882683.15175: _execute() done 30529 1726882683.15179: dumping result to json 30529 1726882683.15182: done dumping result, returning 30529 1726882683.15185: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000001f1d] 30529 1726882683.15195: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1d 30529 1726882683.15274: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1d 30529 1726882683.15277: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882683.15341: no more pending results, returning what we have 30529 1726882683.15344: results queue empty 30529 1726882683.15345: checking for any_errors_fatal 30529 1726882683.15352: done checking for any_errors_fatal 30529 1726882683.15353: 
checking for max_fail_percentage 30529 1726882683.15354: done checking for max_fail_percentage 30529 1726882683.15355: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.15356: done checking to see if all hosts have failed 30529 1726882683.15357: getting the remaining hosts for this loop 30529 1726882683.15358: done getting the remaining hosts for this loop 30529 1726882683.15362: getting the next task for host managed_node1 30529 1726882683.15371: done getting next task for host managed_node1 30529 1726882683.15373: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882683.15377: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.15381: getting variables 30529 1726882683.15382: in VariableManager get_vars() 30529 1726882683.15417: Calling all_inventory to load vars for managed_node1 30529 1726882683.15420: Calling groups_inventory to load vars for managed_node1 30529 1726882683.15423: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.15434: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.15436: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.15439: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.16188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.17045: done with get_vars() 30529 1726882683.17059: done getting variables 30529 1726882683.17101: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882683.17170: variable 'profile' from source: play vars 30529 1726882683.17172: variable 'interface' from source: play vars 30529 1726882683.17210: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:38:03 -0400 (0:00:00.030) 0:01:37.198 ****** 30529 1726882683.17234: entering _queue_task() for managed_node1/set_fact 30529 1726882683.17441: worker is 1 (out of 1 available) 30529 1726882683.17453: exiting _queue_task() for managed_node1/set_fact 30529 1726882683.17465: done queuing things up, now waiting for results queue to drain 30529 1726882683.17467: waiting for pending results... 
30529 1726882683.17635: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882683.17717: in run() - task 12673a56-9f93-b0f1-edc0-000000001f1e 30529 1726882683.17729: variable 'ansible_search_path' from source: unknown 30529 1726882683.17732: variable 'ansible_search_path' from source: unknown 30529 1726882683.17758: calling self._execute() 30529 1726882683.17832: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.17836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.17845: variable 'omit' from source: magic vars 30529 1726882683.18097: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.18107: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.18190: variable 'profile_stat' from source: set_fact 30529 1726882683.18201: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882683.18205: when evaluation is False, skipping this task 30529 1726882683.18208: _execute() done 30529 1726882683.18210: dumping result to json 30529 1726882683.18212: done dumping result, returning 30529 1726882683.18219: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000001f1e] 30529 1726882683.18223: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1e 30529 1726882683.18306: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1e 30529 1726882683.18310: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882683.18377: no more pending results, returning what we have 30529 1726882683.18380: results queue empty 30529 1726882683.18381: checking for any_errors_fatal 30529 1726882683.18387: done checking for any_errors_fatal 30529 1726882683.18387: 
checking for max_fail_percentage 30529 1726882683.18389: done checking for max_fail_percentage 30529 1726882683.18389: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.18390: done checking to see if all hosts have failed 30529 1726882683.18391: getting the remaining hosts for this loop 30529 1726882683.18392: done getting the remaining hosts for this loop 30529 1726882683.18397: getting the next task for host managed_node1 30529 1726882683.18404: done getting next task for host managed_node1 30529 1726882683.18407: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882683.18411: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.18415: getting variables 30529 1726882683.18417: in VariableManager get_vars() 30529 1726882683.18447: Calling all_inventory to load vars for managed_node1 30529 1726882683.18449: Calling groups_inventory to load vars for managed_node1 30529 1726882683.18451: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.18460: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.18463: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.18465: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.19365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.20209: done with get_vars() 30529 1726882683.20224: done getting variables 30529 1726882683.20263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882683.20334: variable 'profile' from source: play vars 30529 1726882683.20337: variable 'interface' from source: play vars 30529 1726882683.20372: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:38:03 -0400 (0:00:00.031) 0:01:37.230 ****** 30529 1726882683.20398: entering _queue_task() for managed_node1/command 30529 1726882683.20603: worker is 1 (out of 1 available) 30529 1726882683.20616: exiting _queue_task() for managed_node1/command 30529 1726882683.20628: done queuing things up, now waiting for results queue to drain 30529 1726882683.20630: waiting for pending results... 
30529 1726882683.20804: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882683.20880: in run() - task 12673a56-9f93-b0f1-edc0-000000001f1f 30529 1726882683.20895: variable 'ansible_search_path' from source: unknown 30529 1726882683.20898: variable 'ansible_search_path' from source: unknown 30529 1726882683.20921: calling self._execute() 30529 1726882683.20995: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.20999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.21009: variable 'omit' from source: magic vars 30529 1726882683.21259: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.21268: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.21355: variable 'profile_stat' from source: set_fact 30529 1726882683.21364: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882683.21367: when evaluation is False, skipping this task 30529 1726882683.21370: _execute() done 30529 1726882683.21373: dumping result to json 30529 1726882683.21375: done dumping result, returning 30529 1726882683.21381: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000001f1f] 30529 1726882683.21385: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1f 30529 1726882683.21465: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f1f 30529 1726882683.21468: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882683.21545: no more pending results, returning what we have 30529 1726882683.21548: results queue empty 30529 1726882683.21548: checking for any_errors_fatal 30529 1726882683.21553: done checking for any_errors_fatal 30529 1726882683.21553: checking for 
max_fail_percentage 30529 1726882683.21555: done checking for max_fail_percentage 30529 1726882683.21556: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.21556: done checking to see if all hosts have failed 30529 1726882683.21557: getting the remaining hosts for this loop 30529 1726882683.21558: done getting the remaining hosts for this loop 30529 1726882683.21561: getting the next task for host managed_node1 30529 1726882683.21568: done getting next task for host managed_node1 30529 1726882683.21571: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882683.21575: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.21579: getting variables 30529 1726882683.21580: in VariableManager get_vars() 30529 1726882683.21615: Calling all_inventory to load vars for managed_node1 30529 1726882683.21617: Calling groups_inventory to load vars for managed_node1 30529 1726882683.21620: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.21628: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.21631: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.21633: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.22365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.23622: done with get_vars() 30529 1726882683.23642: done getting variables 30529 1726882683.23709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882683.23801: variable 'profile' from source: play vars 30529 1726882683.23804: variable 'interface' from source: play vars 30529 1726882683.23840: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:38:03 -0400 (0:00:00.034) 0:01:37.264 ****** 30529 1726882683.23862: entering _queue_task() for managed_node1/set_fact 30529 1726882683.24078: worker is 1 (out of 1 available) 30529 1726882683.24096: exiting _queue_task() for managed_node1/set_fact 30529 1726882683.24109: done queuing things up, now waiting for results queue to drain 30529 1726882683.24111: waiting for pending results... 
30529 1726882683.24278: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882683.24361: in run() - task 12673a56-9f93-b0f1-edc0-000000001f20 30529 1726882683.24372: variable 'ansible_search_path' from source: unknown 30529 1726882683.24375: variable 'ansible_search_path' from source: unknown 30529 1726882683.24407: calling self._execute() 30529 1726882683.24482: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.24485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.24497: variable 'omit' from source: magic vars 30529 1726882683.24754: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.24764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.24852: variable 'profile_stat' from source: set_fact 30529 1726882683.24861: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882683.24865: when evaluation is False, skipping this task 30529 1726882683.24867: _execute() done 30529 1726882683.24870: dumping result to json 30529 1726882683.24872: done dumping result, returning 30529 1726882683.24884: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000001f20] 30529 1726882683.24886: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f20 30529 1726882683.24964: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f20 30529 1726882683.24967: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882683.25032: no more pending results, returning what we have 30529 1726882683.25036: results queue empty 30529 1726882683.25037: checking for any_errors_fatal 30529 1726882683.25041: done checking for any_errors_fatal 30529 1726882683.25042: checking 
for max_fail_percentage 30529 1726882683.25044: done checking for max_fail_percentage 30529 1726882683.25045: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.25045: done checking to see if all hosts have failed 30529 1726882683.25046: getting the remaining hosts for this loop 30529 1726882683.25048: done getting the remaining hosts for this loop 30529 1726882683.25051: getting the next task for host managed_node1 30529 1726882683.25059: done getting next task for host managed_node1 30529 1726882683.25061: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30529 1726882683.25065: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.25068: getting variables 30529 1726882683.25069: in VariableManager get_vars() 30529 1726882683.25104: Calling all_inventory to load vars for managed_node1 30529 1726882683.25107: Calling groups_inventory to load vars for managed_node1 30529 1726882683.25110: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.25119: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.25121: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.25124: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.31523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.32996: done with get_vars() 30529 1726882683.33022: done getting variables 30529 1726882683.33071: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882683.33166: variable 'profile' from source: play vars 30529 1726882683.33169: variable 'interface' from source: play vars 30529 1726882683.33228: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:38:03 -0400 (0:00:00.093) 0:01:37.358 ****** 30529 1726882683.33255: entering _queue_task() for managed_node1/assert 30529 1726882683.33608: worker is 1 (out of 1 available) 30529 1726882683.33620: exiting _queue_task() for managed_node1/assert 30529 1726882683.33634: done queuing things up, now waiting for results queue to drain 30529 1726882683.33637: waiting for pending results... 
30529 1726882683.33857: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' 30529 1726882683.33971: in run() - task 12673a56-9f93-b0f1-edc0-000000001e9a 30529 1726882683.33985: variable 'ansible_search_path' from source: unknown 30529 1726882683.33989: variable 'ansible_search_path' from source: unknown 30529 1726882683.34030: calling self._execute() 30529 1726882683.34168: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.34172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.34175: variable 'omit' from source: magic vars 30529 1726882683.34505: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.34517: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.34524: variable 'omit' from source: magic vars 30529 1726882683.34570: variable 'omit' from source: magic vars 30529 1726882683.34698: variable 'profile' from source: play vars 30529 1726882683.34704: variable 'interface' from source: play vars 30529 1726882683.34744: variable 'interface' from source: play vars 30529 1726882683.34763: variable 'omit' from source: magic vars 30529 1726882683.34821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882683.34839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882683.34859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882683.34930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882683.34933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882683.34936: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882683.34940: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.34942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.35028: Set connection var ansible_shell_executable to /bin/sh 30529 1726882683.35034: Set connection var ansible_pipelining to False 30529 1726882683.35040: Set connection var ansible_shell_type to sh 30529 1726882683.35043: Set connection var ansible_timeout to 10 30529 1726882683.35045: Set connection var ansible_connection to ssh 30529 1726882683.35051: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882683.35073: variable 'ansible_shell_executable' from source: unknown 30529 1726882683.35076: variable 'ansible_connection' from source: unknown 30529 1726882683.35078: variable 'ansible_module_compression' from source: unknown 30529 1726882683.35081: variable 'ansible_shell_type' from source: unknown 30529 1726882683.35083: variable 'ansible_shell_executable' from source: unknown 30529 1726882683.35085: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.35087: variable 'ansible_pipelining' from source: unknown 30529 1726882683.35150: variable 'ansible_timeout' from source: unknown 30529 1726882683.35154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.35237: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882683.35248: variable 'omit' from source: magic vars 30529 1726882683.35258: starting attempt loop 30529 1726882683.35261: running the handler 30529 1726882683.35368: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882683.35372: Evaluated conditional (not 
lsr_net_profile_exists): True 30529 1726882683.35374: handler run complete 30529 1726882683.35397: attempt loop complete, returning result 30529 1726882683.35400: _execute() done 30529 1726882683.35403: dumping result to json 30529 1726882683.35406: done dumping result, returning 30529 1726882683.35409: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' [12673a56-9f93-b0f1-edc0-000000001e9a] 30529 1726882683.35411: sending task result for task 12673a56-9f93-b0f1-edc0-000000001e9a 30529 1726882683.35540: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001e9a 30529 1726882683.35543: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882683.35626: no more pending results, returning what we have 30529 1726882683.35628: results queue empty 30529 1726882683.35629: checking for any_errors_fatal 30529 1726882683.35635: done checking for any_errors_fatal 30529 1726882683.35636: checking for max_fail_percentage 30529 1726882683.35637: done checking for max_fail_percentage 30529 1726882683.35638: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.35639: done checking to see if all hosts have failed 30529 1726882683.35640: getting the remaining hosts for this loop 30529 1726882683.35641: done getting the remaining hosts for this loop 30529 1726882683.35644: getting the next task for host managed_node1 30529 1726882683.35652: done getting next task for host managed_node1 30529 1726882683.35655: ^ task is: TASK: Conditional asserts 30529 1726882683.35657: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882683.35661: getting variables 30529 1726882683.35663: in VariableManager get_vars() 30529 1726882683.35699: Calling all_inventory to load vars for managed_node1 30529 1726882683.35702: Calling groups_inventory to load vars for managed_node1 30529 1726882683.35705: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.35715: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.35717: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.35720: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.37025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.38554: done with get_vars() 30529 1726882683.38580: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:38:03 -0400 (0:00:00.054) 0:01:37.412 ****** 30529 1726882683.38676: entering _queue_task() for managed_node1/include_tasks 30529 1726882683.39025: worker is 1 (out of 1 available) 30529 1726882683.39036: exiting _queue_task() for managed_node1/include_tasks 30529 1726882683.39051: done queuing things up, now waiting for results queue to drain 30529 1726882683.39052: waiting for pending results... 
30529 1726882683.39420: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882683.39471: in run() - task 12673a56-9f93-b0f1-edc0-00000000174a 30529 1726882683.39491: variable 'ansible_search_path' from source: unknown 30529 1726882683.39505: variable 'ansible_search_path' from source: unknown 30529 1726882683.39899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882683.42396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882683.42466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882683.42513: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882683.42551: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882683.42579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882683.42669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882683.42729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882683.42741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882683.42784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882683.42806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882683.42946: variable 'lsr_assert_when' from source: include params 30529 1726882683.43031: variable 'network_provider' from source: set_fact 30529 1726882683.43111: variable 'omit' from source: magic vars 30529 1726882683.43217: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.43231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.43272: variable 'omit' from source: magic vars 30529 1726882683.43446: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.43461: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.43582: variable 'item' from source: unknown 30529 1726882683.43698: Evaluated conditional (item['condition']): True 30529 1726882683.43703: variable 'item' from source: unknown 30529 1726882683.43723: variable 'item' from source: unknown 30529 1726882683.43789: variable 'item' from source: unknown 30529 1726882683.44200: dumping result to json 30529 1726882683.44204: done dumping result, returning 30529 1726882683.44207: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-00000000174a] 30529 1726882683.44209: sending task result for task 12673a56-9f93-b0f1-edc0-00000000174a 30529 1726882683.44255: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000174a 30529 1726882683.44258: WORKER PROCESS EXITING 30529 1726882683.44284: no more pending results, returning what we have 30529 1726882683.44289: in VariableManager get_vars() 30529 1726882683.44340: Calling all_inventory to load vars for managed_node1 30529 1726882683.44343: Calling groups_inventory to load vars for managed_node1 30529 1726882683.44347: 
Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.44357: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.44361: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.44364: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.46024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.47510: done with get_vars() 30529 1726882683.47529: variable 'ansible_search_path' from source: unknown 30529 1726882683.47531: variable 'ansible_search_path' from source: unknown 30529 1726882683.47570: we have included files to process 30529 1726882683.47571: generating all_blocks data 30529 1726882683.47573: done generating all_blocks data 30529 1726882683.47579: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882683.47580: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882683.47582: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882683.47689: in VariableManager get_vars() 30529 1726882683.47714: done with get_vars() 30529 1726882683.47825: done processing included file 30529 1726882683.47827: iterating over new_blocks loaded from include file 30529 1726882683.47829: in VariableManager get_vars() 30529 1726882683.47845: done with get_vars() 30529 1726882683.47847: filtering new block on tags 30529 1726882683.47883: done filtering new block on tags 30529 1726882683.47885: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30529 1726882683.47890: extending task lists for all hosts with included blocks 30529 1726882683.49083: done extending task lists 30529 1726882683.49085: done processing included files 30529 1726882683.49086: results queue empty 30529 1726882683.49086: checking for any_errors_fatal 30529 1726882683.49089: done checking for any_errors_fatal 30529 1726882683.49090: checking for max_fail_percentage 30529 1726882683.49091: done checking for max_fail_percentage 30529 1726882683.49092: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.49095: done checking to see if all hosts have failed 30529 1726882683.49095: getting the remaining hosts for this loop 30529 1726882683.49097: done getting the remaining hosts for this loop 30529 1726882683.49099: getting the next task for host managed_node1 30529 1726882683.49104: done getting next task for host managed_node1 30529 1726882683.49106: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882683.49109: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.49118: getting variables 30529 1726882683.49119: in VariableManager get_vars() 30529 1726882683.49130: Calling all_inventory to load vars for managed_node1 30529 1726882683.49132: Calling groups_inventory to load vars for managed_node1 30529 1726882683.49135: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.49140: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.49143: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.49145: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.50299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.51875: done with get_vars() 30529 1726882683.51898: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:38:03 -0400 (0:00:00.132) 0:01:37.545 ****** 30529 1726882683.51976: entering _queue_task() for managed_node1/include_tasks 30529 1726882683.52344: worker is 1 (out of 1 available) 30529 1726882683.52358: exiting _queue_task() for managed_node1/include_tasks 30529 1726882683.52371: done queuing things up, now waiting for results queue to drain 30529 1726882683.52373: waiting for pending results... 
30529 1726882683.52660: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882683.52819: in run() - task 12673a56-9f93-b0f1-edc0-000000001f59 30529 1726882683.52823: variable 'ansible_search_path' from source: unknown 30529 1726882683.52826: variable 'ansible_search_path' from source: unknown 30529 1726882683.52860: calling self._execute() 30529 1726882683.52962: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.53199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.53204: variable 'omit' from source: magic vars 30529 1726882683.53375: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.53396: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.53409: _execute() done 30529 1726882683.53418: dumping result to json 30529 1726882683.53431: done dumping result, returning 30529 1726882683.53442: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-000000001f59] 30529 1726882683.53452: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f59 30529 1726882683.53579: no more pending results, returning what we have 30529 1726882683.53585: in VariableManager get_vars() 30529 1726882683.53636: Calling all_inventory to load vars for managed_node1 30529 1726882683.53639: Calling groups_inventory to load vars for managed_node1 30529 1726882683.53643: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.53657: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.53661: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.53664: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.54436: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f59 30529 1726882683.54439: WORKER PROCESS EXITING 30529 
1726882683.55302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.56792: done with get_vars() 30529 1726882683.56818: variable 'ansible_search_path' from source: unknown 30529 1726882683.56820: variable 'ansible_search_path' from source: unknown 30529 1726882683.56947: variable 'item' from source: include params 30529 1726882683.56971: we have included files to process 30529 1726882683.56971: generating all_blocks data 30529 1726882683.56973: done generating all_blocks data 30529 1726882683.56974: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882683.56975: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882683.56976: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882683.57110: done processing included file 30529 1726882683.57112: iterating over new_blocks loaded from include file 30529 1726882683.57113: in VariableManager get_vars() 30529 1726882683.57125: done with get_vars() 30529 1726882683.57127: filtering new block on tags 30529 1726882683.57143: done filtering new block on tags 30529 1726882683.57144: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882683.57148: extending task lists for all hosts with included blocks 30529 1726882683.57241: done extending task lists 30529 1726882683.57242: done processing included files 30529 1726882683.57243: results queue empty 30529 1726882683.57243: checking for any_errors_fatal 30529 1726882683.57246: done checking for any_errors_fatal 30529 1726882683.57247: checking for 
max_fail_percentage 30529 1726882683.57247: done checking for max_fail_percentage 30529 1726882683.57248: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.57248: done checking to see if all hosts have failed 30529 1726882683.57249: getting the remaining hosts for this loop 30529 1726882683.57250: done getting the remaining hosts for this loop 30529 1726882683.57251: getting the next task for host managed_node1 30529 1726882683.57254: done getting next task for host managed_node1 30529 1726882683.57256: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882683.57259: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882683.57260: getting variables 30529 1726882683.57261: in VariableManager get_vars() 30529 1726882683.57268: Calling all_inventory to load vars for managed_node1 30529 1726882683.57270: Calling groups_inventory to load vars for managed_node1 30529 1726882683.57271: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.57275: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.57277: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.57278: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.58944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882683.60677: done with get_vars() 30529 1726882683.60706: done getting variables 30529 1726882683.60956: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:38:03 -0400 (0:00:00.090) 0:01:37.635 ****** 30529 1726882683.60990: entering _queue_task() for managed_node1/stat 30529 1726882683.61427: worker is 1 (out of 1 available) 30529 1726882683.61509: exiting _queue_task() for managed_node1/stat 30529 1726882683.61522: done queuing things up, now waiting for results queue to drain 30529 1726882683.61524: waiting for pending results... 
30529 1726882683.61795: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882683.61947: in run() - task 12673a56-9f93-b0f1-edc0-000000001fe8 30529 1726882683.61960: variable 'ansible_search_path' from source: unknown 30529 1726882683.61964: variable 'ansible_search_path' from source: unknown 30529 1726882683.62009: calling self._execute() 30529 1726882683.62114: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.62121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.62137: variable 'omit' from source: magic vars 30529 1726882683.62612: variable 'ansible_distribution_major_version' from source: facts 30529 1726882683.62659: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882683.62662: variable 'omit' from source: magic vars 30529 1726882683.62802: variable 'omit' from source: magic vars 30529 1726882683.62813: variable 'interface' from source: play vars 30529 1726882683.62834: variable 'omit' from source: magic vars 30529 1726882683.62881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882683.62930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882683.62952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882683.62976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882683.62999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882683.63026: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882683.63029: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.63032: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.63298: Set connection var ansible_shell_executable to /bin/sh 30529 1726882683.63302: Set connection var ansible_pipelining to False 30529 1726882683.63305: Set connection var ansible_shell_type to sh 30529 1726882683.63308: Set connection var ansible_timeout to 10 30529 1726882683.63311: Set connection var ansible_connection to ssh 30529 1726882683.63314: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882683.63317: variable 'ansible_shell_executable' from source: unknown 30529 1726882683.63320: variable 'ansible_connection' from source: unknown 30529 1726882683.63323: variable 'ansible_module_compression' from source: unknown 30529 1726882683.63326: variable 'ansible_shell_type' from source: unknown 30529 1726882683.63329: variable 'ansible_shell_executable' from source: unknown 30529 1726882683.63331: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882683.63334: variable 'ansible_pipelining' from source: unknown 30529 1726882683.63337: variable 'ansible_timeout' from source: unknown 30529 1726882683.63339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882683.63474: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882683.63510: variable 'omit' from source: magic vars 30529 1726882683.63513: starting attempt loop 30529 1726882683.63516: running the handler 30529 1726882683.63518: _low_level_execute_command(): starting 30529 1726882683.63533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882683.64716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882683.64783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882683.64818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.64897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.66762: stdout chunk (state=3): >>>/root <<< 30529 1726882683.66765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.66769: stdout chunk (state=3): >>><<< 30529 1726882683.66776: stderr chunk (state=3): >>><<< 30529 1726882683.66802: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882683.66817: _low_level_execute_command(): starting 30529 1726882683.66824: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394 `" && echo ansible-tmp-1726882683.6680288-35158-255289170097394="` echo /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394 `" ) && sleep 0' 30529 1726882683.67572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882683.67703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882683.67747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882683.67750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882683.67898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882683.68071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.68104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.69965: stdout chunk (state=3): >>>ansible-tmp-1726882683.6680288-35158-255289170097394=/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394 <<< 30529 1726882683.70199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.70202: stdout chunk (state=3): >>><<< 30529 1726882683.70205: stderr chunk (state=3): >>><<< 30529 1726882683.70208: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882683.6680288-35158-255289170097394=/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882683.70211: variable 'ansible_module_compression' from source: unknown 30529 1726882683.70238: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882683.70276: variable 'ansible_facts' from source: unknown 30529 1726882683.70379: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py 30529 1726882683.70677: Sending initial data 30529 1726882683.70680: Sent initial data (153 bytes) 30529 1726882683.71567: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882683.71745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.71820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.73370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882683.73405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882683.73465: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp118jh6yo /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py <<< 30529 1726882683.73485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py" <<< 30529 1726882683.73533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp118jh6yo" to remote "/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py" <<< 30529 1726882683.74356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.74359: stdout chunk (state=3): >>><<< 30529 1726882683.74361: stderr chunk (state=3): >>><<< 30529 1726882683.74370: done transferring module to remote 30529 1726882683.74384: _low_level_execute_command(): starting 30529 1726882683.74395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/ /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py && sleep 0' 30529 1726882683.75026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882683.75110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882683.75150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882683.75165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882683.75187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.75260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.77117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.77132: stdout chunk (state=3): >>><<< 30529 1726882683.77145: stderr chunk (state=3): >>><<< 30529 1726882683.77238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882683.77245: _low_level_execute_command(): starting 30529 1726882683.77248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/AnsiballZ_stat.py && sleep 0' 30529 1726882683.77787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882683.77804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882683.77822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882683.77842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882683.77942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882683.77959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882683.77981: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882683.78058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.93334: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882683.94355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.94405: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 30529 1726882683.94450: stderr chunk (state=3): >>><<< 30529 1726882683.94458: stdout chunk (state=3): >>><<< 30529 1726882683.94500: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882683.94523: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882683.94539: _low_level_execute_command(): starting 30529 1726882683.94603: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882683.6680288-35158-255289170097394/ > /dev/null 2>&1 && sleep 0' 30529 1726882683.95199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882683.95212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882683.95228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882683.95265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882683.95284: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882683.95378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882683.95395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882683.95417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882683.95495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882683.97327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882683.97336: stdout chunk (state=3): >>><<< 30529 1726882683.97352: stderr chunk (state=3): >>><<< 30529 1726882683.97371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882683.97386: handler run complete 30529 1726882683.97413: attempt loop complete, returning result 30529 1726882683.97599: _execute() done 30529 1726882683.97602: dumping result to json 30529 1726882683.97604: done dumping result, returning 30529 1726882683.97606: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000001fe8] 30529 1726882683.97608: sending task result for task 12673a56-9f93-b0f1-edc0-000000001fe8 30529 1726882683.97685: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001fe8 30529 1726882683.97688: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882683.97753: no more pending results, returning what we have 30529 1726882683.97756: results queue empty 30529 1726882683.97757: checking for any_errors_fatal 30529 1726882683.97759: done checking for any_errors_fatal 30529 1726882683.97759: checking for max_fail_percentage 30529 1726882683.97761: done checking for max_fail_percentage 30529 1726882683.97762: checking to see if all hosts have failed and the running result is not ok 30529 1726882683.97763: done checking to see if all hosts have failed 30529 1726882683.97764: getting the remaining hosts for this loop 30529 1726882683.97766: done getting the remaining hosts for this loop 30529 1726882683.97770: getting the next task for host managed_node1 30529 1726882683.97782: done getting next task for host managed_node1 30529 1726882683.97784: ^ task is: TASK: Assert that the interface is absent - '{{ 
interface }}' 30529 1726882683.97789: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882683.97797: getting variables 30529 1726882683.97799: in VariableManager get_vars() 30529 1726882683.97840: Calling all_inventory to load vars for managed_node1 30529 1726882683.97843: Calling groups_inventory to load vars for managed_node1 30529 1726882683.97846: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882683.97858: Calling all_plugins_play to load vars for managed_node1 30529 1726882683.97862: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882683.97865: Calling groups_plugins_play to load vars for managed_node1 30529 1726882683.99513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.01173: done with get_vars() 30529 1726882684.01203: done getting variables 30529 1726882684.01269: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 30529 1726882684.01392: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:38:04 -0400 (0:00:00.404) 0:01:38.040 ****** 30529 1726882684.01427: entering _queue_task() for managed_node1/assert 30529 1726882684.02006: worker is 1 (out of 1 available) 30529 1726882684.02018: exiting _queue_task() for managed_node1/assert 30529 1726882684.02028: done queuing things up, now waiting for results queue to drain 30529 1726882684.02029: waiting for pending results... 30529 1726882684.02211: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' 30529 1726882684.02258: in run() - task 12673a56-9f93-b0f1-edc0-000000001f5a 30529 1726882684.02279: variable 'ansible_search_path' from source: unknown 30529 1726882684.02287: variable 'ansible_search_path' from source: unknown 30529 1726882684.02366: calling self._execute() 30529 1726882684.02431: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.02442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.02454: variable 'omit' from source: magic vars 30529 1726882684.02838: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.02854: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.02865: variable 'omit' from source: magic vars 30529 1726882684.02999: variable 'omit' from source: magic vars 30529 1726882684.03025: variable 'interface' from source: play vars 30529 1726882684.03047: variable 'omit' from source: magic vars 30529 1726882684.03089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882684.03137: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.03163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882684.03183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.03204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.03243: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.03251: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.03257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.03369: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.03379: Set connection var ansible_pipelining to False 30529 1726882684.03385: Set connection var ansible_shell_type to sh 30529 1726882684.03400: Set connection var ansible_timeout to 10 30529 1726882684.03407: Set connection var ansible_connection to ssh 30529 1726882684.03415: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.03438: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.03452: variable 'ansible_connection' from source: unknown 30529 1726882684.03561: variable 'ansible_module_compression' from source: unknown 30529 1726882684.03564: variable 'ansible_shell_type' from source: unknown 30529 1726882684.03566: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.03568: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.03570: variable 'ansible_pipelining' from source: unknown 30529 1726882684.03571: variable 'ansible_timeout' from source: unknown 30529 1726882684.03573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882684.03628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.03645: variable 'omit' from source: magic vars 30529 1726882684.03654: starting attempt loop 30529 1726882684.03659: running the handler 30529 1726882684.03814: variable 'interface_stat' from source: set_fact 30529 1726882684.03827: Evaluated conditional (not interface_stat.stat.exists): True 30529 1726882684.03836: handler run complete 30529 1726882684.03852: attempt loop complete, returning result 30529 1726882684.03858: _execute() done 30529 1726882684.03863: dumping result to json 30529 1726882684.03869: done dumping result, returning 30529 1726882684.03887: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' [12673a56-9f93-b0f1-edc0-000000001f5a] 30529 1726882684.03899: sending task result for task 12673a56-9f93-b0f1-edc0-000000001f5a ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882684.04037: no more pending results, returning what we have 30529 1726882684.04041: results queue empty 30529 1726882684.04042: checking for any_errors_fatal 30529 1726882684.04054: done checking for any_errors_fatal 30529 1726882684.04055: checking for max_fail_percentage 30529 1726882684.04056: done checking for max_fail_percentage 30529 1726882684.04058: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.04059: done checking to see if all hosts have failed 30529 1726882684.04060: getting the remaining hosts for this loop 30529 1726882684.04061: done getting the remaining hosts for this loop 30529 1726882684.04065: getting the next task for host managed_node1 30529 1726882684.04076: done getting next task for host managed_node1 
30529 1726882684.04079: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882684.04083: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882684.04087: getting variables 30529 1726882684.04089: in VariableManager get_vars() 30529 1726882684.04132: Calling all_inventory to load vars for managed_node1 30529 1726882684.04134: Calling groups_inventory to load vars for managed_node1 30529 1726882684.04138: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.04150: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.04154: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.04157: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.04919: done sending task result for task 12673a56-9f93-b0f1-edc0-000000001f5a 30529 1726882684.04922: WORKER PROCESS EXITING 30529 1726882684.06135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.07707: done with get_vars() 30529 1726882684.07730: done getting variables 30529 1726882684.07869: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882684.07990: variable 
'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:38:04 -0400 (0:00:00.065) 0:01:38.106 ****** 30529 1726882684.08024: entering _queue_task() for managed_node1/debug 30529 1726882684.08491: worker is 1 (out of 1 available) 30529 1726882684.08503: exiting _queue_task() for managed_node1/debug 30529 1726882684.08514: done queuing things up, now waiting for results queue to drain 30529 1726882684.08515: waiting for pending results... 30529 1726882684.08714: running TaskExecutor() for managed_node1/TASK: Success in test 'I can take a profile down that is absent' 30529 1726882684.08834: in run() - task 12673a56-9f93-b0f1-edc0-00000000174b 30529 1726882684.08859: variable 'ansible_search_path' from source: unknown 30529 1726882684.08866: variable 'ansible_search_path' from source: unknown 30529 1726882684.08907: calling self._execute() 30529 1726882684.09006: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.09021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.09039: variable 'omit' from source: magic vars 30529 1726882684.09443: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.09467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.09479: variable 'omit' from source: magic vars 30529 1726882684.09527: variable 'omit' from source: magic vars 30529 1726882684.09679: variable 'lsr_description' from source: include params 30529 1726882684.09683: variable 'omit' from source: magic vars 30529 1726882684.09707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882684.09752: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.09784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882684.09810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.09834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.09869: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.09897: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.09900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.10004: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.10109: Set connection var ansible_pipelining to False 30529 1726882684.10112: Set connection var ansible_shell_type to sh 30529 1726882684.10115: Set connection var ansible_timeout to 10 30529 1726882684.10117: Set connection var ansible_connection to ssh 30529 1726882684.10119: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.10121: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.10124: variable 'ansible_connection' from source: unknown 30529 1726882684.10126: variable 'ansible_module_compression' from source: unknown 30529 1726882684.10128: variable 'ansible_shell_type' from source: unknown 30529 1726882684.10130: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.10132: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.10134: variable 'ansible_pipelining' from source: unknown 30529 1726882684.10136: variable 'ansible_timeout' from source: unknown 30529 1726882684.10138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882684.10256: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.10273: variable 'omit' from source: magic vars 30529 1726882684.10283: starting attempt loop 30529 1726882684.10290: running the handler 30529 1726882684.10362: handler run complete 30529 1726882684.10371: attempt loop complete, returning result 30529 1726882684.10397: _execute() done 30529 1726882684.10401: dumping result to json 30529 1726882684.10404: done dumping result, returning 30529 1726882684.10407: done running TaskExecutor() for managed_node1/TASK: Success in test 'I can take a profile down that is absent' [12673a56-9f93-b0f1-edc0-00000000174b] 30529 1726882684.10410: sending task result for task 12673a56-9f93-b0f1-edc0-00000000174b 30529 1726882684.10539: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000174b 30529 1726882684.10543: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 30529 1726882684.10623: no more pending results, returning what we have 30529 1726882684.10626: results queue empty 30529 1726882684.10628: checking for any_errors_fatal 30529 1726882684.10635: done checking for any_errors_fatal 30529 1726882684.10636: checking for max_fail_percentage 30529 1726882684.10638: done checking for max_fail_percentage 30529 1726882684.10639: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.10640: done checking to see if all hosts have failed 30529 1726882684.10641: getting the remaining hosts for this loop 30529 1726882684.10642: done getting the remaining hosts for this loop 30529 1726882684.10646: getting the next task for host managed_node1 30529 1726882684.10770: done getting next 
task for host managed_node1 30529 1726882684.10773: ^ task is: TASK: Cleanup 30529 1726882684.10777: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882684.10783: getting variables 30529 1726882684.10785: in VariableManager get_vars() 30529 1726882684.10831: Calling all_inventory to load vars for managed_node1 30529 1726882684.10834: Calling groups_inventory to load vars for managed_node1 30529 1726882684.10838: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.10849: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.10852: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.10855: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.12496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.14783: done with get_vars() 30529 1726882684.14814: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:38:04 -0400 (0:00:00.071) 0:01:38.178 ****** 30529 1726882684.15219: entering _queue_task() for managed_node1/include_tasks 30529 1726882684.16470: worker is 1 (out of 1 available) 30529 1726882684.16483: exiting _queue_task() for managed_node1/include_tasks 30529 
1726882684.16498: done queuing things up, now waiting for results queue to drain 30529 1726882684.16500: waiting for pending results... 30529 1726882684.17225: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882684.17335: in run() - task 12673a56-9f93-b0f1-edc0-00000000174f 30529 1726882684.17345: variable 'ansible_search_path' from source: unknown 30529 1726882684.17349: variable 'ansible_search_path' from source: unknown 30529 1726882684.17396: variable 'lsr_cleanup' from source: include params 30529 1726882684.17773: variable 'lsr_cleanup' from source: include params 30529 1726882684.17778: variable 'omit' from source: magic vars 30529 1726882684.17884: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.17897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.17992: variable 'omit' from source: magic vars 30529 1726882684.18144: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.18208: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.18214: variable 'item' from source: unknown 30529 1726882684.18220: variable 'item' from source: unknown 30529 1726882684.18251: variable 'item' from source: unknown 30529 1726882684.18316: variable 'item' from source: unknown 30529 1726882684.18630: dumping result to json 30529 1726882684.18634: done dumping result, returning 30529 1726882684.18637: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-00000000174f] 30529 1726882684.18639: sending task result for task 12673a56-9f93-b0f1-edc0-00000000174f 30529 1726882684.18676: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000174f 30529 1726882684.18678: WORKER PROCESS EXITING 30529 1726882684.18704: no more pending results, returning what we have 30529 1726882684.18709: in VariableManager get_vars() 30529 1726882684.18746: Calling all_inventory to load vars for 
managed_node1 30529 1726882684.18749: Calling groups_inventory to load vars for managed_node1 30529 1726882684.18752: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.18762: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.18765: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.18768: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.20852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.22756: done with get_vars() 30529 1726882684.22782: variable 'ansible_search_path' from source: unknown 30529 1726882684.22784: variable 'ansible_search_path' from source: unknown 30529 1726882684.22835: we have included files to process 30529 1726882684.22836: generating all_blocks data 30529 1726882684.22838: done generating all_blocks data 30529 1726882684.22844: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882684.22845: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882684.22848: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882684.23058: done processing included file 30529 1726882684.23061: iterating over new_blocks loaded from include file 30529 1726882684.23062: in VariableManager get_vars() 30529 1726882684.23080: done with get_vars() 30529 1726882684.23082: filtering new block on tags 30529 1726882684.23119: done filtering new block on tags 30529 1726882684.23122: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for 
managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882684.23127: extending task lists for all hosts with included blocks 30529 1726882684.24614: done extending task lists 30529 1726882684.24615: done processing included files 30529 1726882684.24616: results queue empty 30529 1726882684.24617: checking for any_errors_fatal 30529 1726882684.24621: done checking for any_errors_fatal 30529 1726882684.24622: checking for max_fail_percentage 30529 1726882684.24623: done checking for max_fail_percentage 30529 1726882684.24624: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.24624: done checking to see if all hosts have failed 30529 1726882684.24625: getting the remaining hosts for this loop 30529 1726882684.24627: done getting the remaining hosts for this loop 30529 1726882684.24629: getting the next task for host managed_node1 30529 1726882684.24634: done getting next task for host managed_node1 30529 1726882684.24641: ^ task is: TASK: Cleanup profile and device 30529 1726882684.24645: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882684.24648: getting variables 30529 1726882684.24649: in VariableManager get_vars() 30529 1726882684.24663: Calling all_inventory to load vars for managed_node1 30529 1726882684.24666: Calling groups_inventory to load vars for managed_node1 30529 1726882684.24668: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.24674: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.24676: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.24679: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.26069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.27692: done with get_vars() 30529 1726882684.27722: done getting variables 30529 1726882684.27768: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:38:04 -0400 (0:00:00.125) 0:01:38.304 ****** 30529 1726882684.27810: entering _queue_task() for managed_node1/shell 30529 1726882684.28335: worker is 1 (out of 1 available) 30529 1726882684.28345: exiting _queue_task() for managed_node1/shell 30529 1726882684.28359: done queuing things up, now waiting for results queue to drain 30529 1726882684.28360: waiting for pending results... 
30529 1726882684.28784: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882684.28792: in run() - task 12673a56-9f93-b0f1-edc0-00000000200b 30529 1726882684.28798: variable 'ansible_search_path' from source: unknown 30529 1726882684.28800: variable 'ansible_search_path' from source: unknown 30529 1726882684.28803: calling self._execute() 30529 1726882684.28908: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.28919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.28934: variable 'omit' from source: magic vars 30529 1726882684.29327: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.29346: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.29358: variable 'omit' from source: magic vars 30529 1726882684.29413: variable 'omit' from source: magic vars 30529 1726882684.29571: variable 'interface' from source: play vars 30529 1726882684.29599: variable 'omit' from source: magic vars 30529 1726882684.29650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882684.29753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.29756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882684.29759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.29761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.29789: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.29801: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.29810: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.29931: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.29942: Set connection var ansible_pipelining to False 30529 1726882684.29949: Set connection var ansible_shell_type to sh 30529 1726882684.29968: Set connection var ansible_timeout to 10 30529 1726882684.29979: Set connection var ansible_connection to ssh 30529 1726882684.29994: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.30079: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.30083: variable 'ansible_connection' from source: unknown 30529 1726882684.30085: variable 'ansible_module_compression' from source: unknown 30529 1726882684.30087: variable 'ansible_shell_type' from source: unknown 30529 1726882684.30091: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.30092: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.30096: variable 'ansible_pipelining' from source: unknown 30529 1726882684.30097: variable 'ansible_timeout' from source: unknown 30529 1726882684.30099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.30299: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.30303: variable 'omit' from source: magic vars 30529 1726882684.30305: starting attempt loop 30529 1726882684.30308: running the handler 30529 1726882684.30311: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.30314: _low_level_execute_command(): starting 30529 1726882684.30329: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882684.31198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882684.31223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882684.31243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882684.31267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.31498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.33068: stdout chunk (state=3): >>>/root <<< 30529 1726882684.33399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882684.33441: stderr chunk (state=3): >>><<< 30529 1726882684.33444: stdout chunk (state=3): 
>>><<< 30529 1726882684.33516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882684.33520: _low_level_execute_command(): starting 30529 1726882684.33524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276 `" && echo ansible-tmp-1726882684.3346455-35197-280017277744276="` echo /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276 `" ) && sleep 0' 30529 1726882684.34606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882684.34614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882684.34800: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882684.34849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.35189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.37101: stdout chunk (state=3): >>>ansible-tmp-1726882684.3346455-35197-280017277744276=/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276 <<< 30529 1726882684.37208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882684.37212: stdout chunk (state=3): >>><<< 30529 1726882684.37227: stderr chunk (state=3): >>><<< 30529 1726882684.37239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882684.3346455-35197-280017277744276=/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882684.37272: variable 'ansible_module_compression' from source: unknown 30529 1726882684.37325: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882684.37362: variable 'ansible_facts' from source: unknown 30529 1726882684.37446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py 30529 1726882684.38017: Sending initial data 30529 1726882684.38036: Sent initial data (156 bytes) 30529 1726882684.38799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882684.38869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882684.38880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882684.39052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882684.39056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 30529 1726882684.39059: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882684.39063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882684.39065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882684.39068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882684.39101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.39172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.40764: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 
1726882684.40857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882684.40866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqwlht924 /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py <<< 30529 1726882684.40869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py" <<< 30529 1726882684.41041: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqwlht924" to remote "/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py" <<< 30529 1726882684.41956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882684.41959: stdout chunk (state=3): >>><<< 30529 1726882684.41961: stderr chunk (state=3): >>><<< 30529 1726882684.41963: done transferring module to remote 30529 1726882684.41965: _low_level_execute_command(): starting 30529 1726882684.41967: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/ /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py && sleep 0' 30529 1726882684.42807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882684.42821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882684.42827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882684.42907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882684.42934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882684.42937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882684.42959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.43023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.45099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882684.45102: stdout chunk (state=3): >>><<< 30529 1726882684.45105: stderr chunk (state=3): >>><<< 30529 1726882684.45107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882684.45110: _low_level_execute_command(): starting 30529 1726882684.45112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/AnsiballZ_command.py && sleep 0' 30529 1726882684.46010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882684.46108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882684.46240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882684.46296: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882684.46305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.46380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.64304: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:38:04.610656", "end": "2024-09-20 21:38:04.641621", "delta": "0:00:00.030965", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882684.65660: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882684.65665: stdout chunk (state=3): >>><<< 30529 1726882684.65670: stderr chunk (state=3): >>><<< 30529 1726882684.65822: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:38:04.610656", "end": "2024-09-20 21:38:04.641621", "delta": "0:00:00.030965", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 30529 1726882684.65862: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882684.65871: _low_level_execute_command(): starting 30529 1726882684.65876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882684.3346455-35197-280017277744276/ > /dev/null 2>&1 && sleep 0' 30529 1726882684.67048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882684.67057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882684.67302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882684.67306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882684.67309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882684.67348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882684.69153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882684.69157: stderr chunk (state=3): >>><<< 30529 1726882684.69160: stdout chunk (state=3): >>><<< 30529 1726882684.69237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882684.69242: handler run complete 30529 1726882684.69267: Evaluated conditional (False): False 30529 1726882684.69277: attempt loop complete, returning result 30529 1726882684.69280: _execute() done 30529 1726882684.69282: dumping result to json 30529 1726882684.69290: done dumping result, returning 30529 1726882684.69399: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-00000000200b] 30529 1726882684.69403: sending task result for task 12673a56-9f93-b0f1-edc0-00000000200b 30529 1726882684.69539: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000200b 30529 1726882684.69543: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.030965", "end": "2024-09-20 21:38:04.641621", "rc": 1, "start": "2024-09-20 21:38:04.610656" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30529 1726882684.69618: no more pending results, returning what we have 30529 1726882684.69622: results queue empty 30529 1726882684.69623: checking for any_errors_fatal 30529 1726882684.69624: done checking for any_errors_fatal 30529 1726882684.69625: checking for max_fail_percentage 30529 1726882684.69627: done checking for max_fail_percentage 30529 1726882684.69628: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.69629: done checking to see if all hosts have failed 30529 1726882684.69630: getting the remaining hosts for this loop 30529 1726882684.69632: done getting the remaining hosts for this loop 30529 1726882684.69636: getting the next task for host managed_node1 30529 1726882684.69650: done getting next task for host managed_node1 30529 1726882684.69653: ^ task is: TASK: Include the task 'run_test.yml' 30529 1726882684.69655: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882684.69660: getting variables 30529 1726882684.69662: in VariableManager get_vars() 30529 1726882684.69708: Calling all_inventory to load vars for managed_node1 30529 1726882684.69711: Calling groups_inventory to load vars for managed_node1 30529 1726882684.69715: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.69727: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.69730: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.69733: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.73082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.76496: done with get_vars() 30529 1726882684.76600: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Friday 20 September 2024 21:38:04 -0400 (0:00:00.488) 0:01:38.793 ****** 30529 1726882684.76703: entering _queue_task() for managed_node1/include_tasks 30529 1726882684.77551: worker is 1 (out of 1 available) 30529 1726882684.77564: exiting _queue_task() for managed_node1/include_tasks 30529 1726882684.77577: done queuing things up, now waiting for results queue to drain 30529 1726882684.77579: waiting for pending results... 
30529 1726882684.78210: running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' 30529 1726882684.78215: in run() - task 12673a56-9f93-b0f1-edc0-000000000017 30529 1726882684.78218: variable 'ansible_search_path' from source: unknown 30529 1726882684.78244: calling self._execute() 30529 1726882684.78347: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.78698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.78701: variable 'omit' from source: magic vars 30529 1726882684.79198: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.79498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.79502: _execute() done 30529 1726882684.79504: dumping result to json 30529 1726882684.79506: done dumping result, returning 30529 1726882684.79508: done running TaskExecutor() for managed_node1/TASK: Include the task 'run_test.yml' [12673a56-9f93-b0f1-edc0-000000000017] 30529 1726882684.79510: sending task result for task 12673a56-9f93-b0f1-edc0-000000000017 30529 1726882684.79608: done sending task result for task 12673a56-9f93-b0f1-edc0-000000000017 30529 1726882684.79612: WORKER PROCESS EXITING 30529 1726882684.79640: no more pending results, returning what we have 30529 1726882684.79646: in VariableManager get_vars() 30529 1726882684.79810: Calling all_inventory to load vars for managed_node1 30529 1726882684.79814: Calling groups_inventory to load vars for managed_node1 30529 1726882684.79818: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.79831: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.79834: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.79838: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.81959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30529 1726882684.83726: done with get_vars() 30529 1726882684.83749: variable 'ansible_search_path' from source: unknown 30529 1726882684.83769: we have included files to process 30529 1726882684.83771: generating all_blocks data 30529 1726882684.83772: done generating all_blocks data 30529 1726882684.83777: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882684.83778: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882684.83781: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30529 1726882684.84247: in VariableManager get_vars() 30529 1726882684.84337: done with get_vars() 30529 1726882684.84388: in VariableManager get_vars() 30529 1726882684.84422: done with get_vars() 30529 1726882684.84500: in VariableManager get_vars() 30529 1726882684.84581: done with get_vars() 30529 1726882684.84626: in VariableManager get_vars() 30529 1726882684.84770: done with get_vars() 30529 1726882684.84812: in VariableManager get_vars() 30529 1726882684.84830: done with get_vars() 30529 1726882684.85282: in VariableManager get_vars() 30529 1726882684.85310: done with get_vars() 30529 1726882684.85322: done processing included file 30529 1726882684.85324: iterating over new_blocks loaded from include file 30529 1726882684.85326: in VariableManager get_vars() 30529 1726882684.85339: done with get_vars() 30529 1726882684.85340: filtering new block on tags 30529 1726882684.85448: done filtering new block on tags 30529 1726882684.85451: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node1 30529 1726882684.85457: extending task lists for all hosts with included 
blocks 30529 1726882684.85489: done extending task lists 30529 1726882684.85490: done processing included files 30529 1726882684.85491: results queue empty 30529 1726882684.85491: checking for any_errors_fatal 30529 1726882684.85498: done checking for any_errors_fatal 30529 1726882684.85499: checking for max_fail_percentage 30529 1726882684.85500: done checking for max_fail_percentage 30529 1726882684.85501: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.85508: done checking to see if all hosts have failed 30529 1726882684.85509: getting the remaining hosts for this loop 30529 1726882684.85510: done getting the remaining hosts for this loop 30529 1726882684.85513: getting the next task for host managed_node1 30529 1726882684.85517: done getting next task for host managed_node1 30529 1726882684.85519: ^ task is: TASK: TEST: {{ lsr_description }} 30529 1726882684.85521: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882684.85523: getting variables 30529 1726882684.85524: in VariableManager get_vars() 30529 1726882684.85534: Calling all_inventory to load vars for managed_node1 30529 1726882684.85536: Calling groups_inventory to load vars for managed_node1 30529 1726882684.85538: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.85543: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.85545: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.85548: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.86940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.88534: done with get_vars() 30529 1726882684.88564: done getting variables 30529 1726882684.88607: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882684.88724: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:38:04 -0400 (0:00:00.120) 0:01:38.913 ****** 30529 1726882684.88755: entering _queue_task() for managed_node1/debug 30529 1726882684.89160: worker is 1 (out of 1 available) 30529 1726882684.89172: exiting _queue_task() for managed_node1/debug 30529 1726882684.89186: done queuing things up, now waiting for results queue to drain 30529 1726882684.89187: waiting for pending results... 
30529 1726882684.89533: running TaskExecutor() for managed_node1/TASK: TEST: I will not get an error when I try to remove an absent profile 30529 1726882684.89647: in run() - task 12673a56-9f93-b0f1-edc0-0000000020ad 30529 1726882684.89661: variable 'ansible_search_path' from source: unknown 30529 1726882684.89664: variable 'ansible_search_path' from source: unknown 30529 1726882684.89701: calling self._execute() 30529 1726882684.89802: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.89806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.89826: variable 'omit' from source: magic vars 30529 1726882684.90398: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.90401: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.90404: variable 'omit' from source: magic vars 30529 1726882684.90407: variable 'omit' from source: magic vars 30529 1726882684.90454: variable 'lsr_description' from source: include params 30529 1726882684.90472: variable 'omit' from source: magic vars 30529 1726882684.90522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882684.90559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.90579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882684.90602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.90623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.90649: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.90653: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882684.90655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.90899: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.90903: Set connection var ansible_pipelining to False 30529 1726882684.90905: Set connection var ansible_shell_type to sh 30529 1726882684.90908: Set connection var ansible_timeout to 10 30529 1726882684.90910: Set connection var ansible_connection to ssh 30529 1726882684.90913: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.90915: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.90918: variable 'ansible_connection' from source: unknown 30529 1726882684.90921: variable 'ansible_module_compression' from source: unknown 30529 1726882684.90924: variable 'ansible_shell_type' from source: unknown 30529 1726882684.90928: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.90930: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.90932: variable 'ansible_pipelining' from source: unknown 30529 1726882684.90935: variable 'ansible_timeout' from source: unknown 30529 1726882684.90938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.91019: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.91029: variable 'omit' from source: magic vars 30529 1726882684.91035: starting attempt loop 30529 1726882684.91038: running the handler 30529 1726882684.91229: handler run complete 30529 1726882684.91499: attempt loop complete, returning result 30529 1726882684.91502: _execute() done 30529 1726882684.91504: dumping result to json 30529 1726882684.91506: done dumping result, 
returning 30529 1726882684.91508: done running TaskExecutor() for managed_node1/TASK: TEST: I will not get an error when I try to remove an absent profile [12673a56-9f93-b0f1-edc0-0000000020ad] 30529 1726882684.91510: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020ad 30529 1726882684.91568: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020ad 30529 1726882684.91571: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 30529 1726882684.91615: no more pending results, returning what we have 30529 1726882684.91619: results queue empty 30529 1726882684.91620: checking for any_errors_fatal 30529 1726882684.91621: done checking for any_errors_fatal 30529 1726882684.91627: checking for max_fail_percentage 30529 1726882684.91629: done checking for max_fail_percentage 30529 1726882684.91630: checking to see if all hosts have failed and the running result is not ok 30529 1726882684.91631: done checking to see if all hosts have failed 30529 1726882684.91631: getting the remaining hosts for this loop 30529 1726882684.91633: done getting the remaining hosts for this loop 30529 1726882684.91637: getting the next task for host managed_node1 30529 1726882684.91644: done getting next task for host managed_node1 30529 1726882684.91647: ^ task is: TASK: Show item 30529 1726882684.91650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882684.91654: getting variables 30529 1726882684.91656: in VariableManager get_vars() 30529 1726882684.91695: Calling all_inventory to load vars for managed_node1 30529 1726882684.91699: Calling groups_inventory to load vars for managed_node1 30529 1726882684.91703: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882684.91713: Calling all_plugins_play to load vars for managed_node1 30529 1726882684.91716: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882684.91720: Calling groups_plugins_play to load vars for managed_node1 30529 1726882684.93232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882684.95033: done with get_vars() 30529 1726882684.95059: done getting variables 30529 1726882684.95129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:38:04 -0400 (0:00:00.064) 0:01:38.977 ****** 30529 1726882684.95160: entering _queue_task() for managed_node1/debug 30529 1726882684.95790: worker is 1 (out of 1 available) 30529 1726882684.95802: exiting _queue_task() for managed_node1/debug 30529 1726882684.95812: done queuing things up, now waiting for results queue to drain 30529 1726882684.95814: waiting for pending results... 
30529 1726882684.95879: running TaskExecutor() for managed_node1/TASK: Show item 30529 1726882684.95979: in run() - task 12673a56-9f93-b0f1-edc0-0000000020ae 30529 1726882684.95996: variable 'ansible_search_path' from source: unknown 30529 1726882684.96001: variable 'ansible_search_path' from source: unknown 30529 1726882684.96299: variable 'omit' from source: magic vars 30529 1726882684.96303: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.96306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.96309: variable 'omit' from source: magic vars 30529 1726882684.96618: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.96630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.96636: variable 'omit' from source: magic vars 30529 1726882684.96673: variable 'omit' from source: magic vars 30529 1726882684.96727: variable 'item' from source: unknown 30529 1726882684.96795: variable 'item' from source: unknown 30529 1726882684.96823: variable 'omit' from source: magic vars 30529 1726882684.96873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882684.96908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.96939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882684.96956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.96971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.97002: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.97006: variable 'ansible_host' from source: host vars for 'managed_node1' 
30529 1726882684.97009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.97128: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.97146: Set connection var ansible_pipelining to False 30529 1726882684.97153: Set connection var ansible_shell_type to sh 30529 1726882684.97176: Set connection var ansible_timeout to 10 30529 1726882684.97179: Set connection var ansible_connection to ssh 30529 1726882684.97182: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.97199: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.97202: variable 'ansible_connection' from source: unknown 30529 1726882684.97204: variable 'ansible_module_compression' from source: unknown 30529 1726882684.97206: variable 'ansible_shell_type' from source: unknown 30529 1726882684.97209: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.97211: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.97217: variable 'ansible_pipelining' from source: unknown 30529 1726882684.97219: variable 'ansible_timeout' from source: unknown 30529 1726882684.97309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.97567: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.97571: variable 'omit' from source: magic vars 30529 1726882684.97825: starting attempt loop 30529 1726882684.97829: running the handler 30529 1726882684.97874: variable 'lsr_description' from source: include params 30529 1726882684.97949: variable 'lsr_description' from source: include params 30529 1726882684.97959: handler run complete 30529 1726882684.97977: attempt loop 
complete, returning result 30529 1726882684.97995: variable 'item' from source: unknown 30529 1726882684.98192: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 30529 1726882684.98576: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.98579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.98586: variable 'omit' from source: magic vars 30529 1726882684.98997: variable 'ansible_distribution_major_version' from source: facts 30529 1726882684.99001: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882684.99003: variable 'omit' from source: magic vars 30529 1726882684.99005: variable 'omit' from source: magic vars 30529 1726882684.99009: variable 'item' from source: unknown 30529 1726882684.99011: variable 'item' from source: unknown 30529 1726882684.99013: variable 'omit' from source: magic vars 30529 1726882684.99016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882684.99018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.99020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882684.99066: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882684.99069: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.99071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.99134: Set connection var ansible_shell_executable to /bin/sh 30529 1726882684.99154: Set connection var 
ansible_pipelining to False 30529 1726882684.99158: Set connection var ansible_shell_type to sh 30529 1726882684.99168: Set connection var ansible_timeout to 10 30529 1726882684.99171: Set connection var ansible_connection to ssh 30529 1726882684.99178: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882684.99197: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.99206: variable 'ansible_connection' from source: unknown 30529 1726882684.99209: variable 'ansible_module_compression' from source: unknown 30529 1726882684.99219: variable 'ansible_shell_type' from source: unknown 30529 1726882684.99222: variable 'ansible_shell_executable' from source: unknown 30529 1726882684.99226: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.99230: variable 'ansible_pipelining' from source: unknown 30529 1726882684.99232: variable 'ansible_timeout' from source: unknown 30529 1726882684.99237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.99346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882684.99356: variable 'omit' from source: magic vars 30529 1726882684.99359: starting attempt loop 30529 1726882684.99361: running the handler 30529 1726882684.99385: variable 'lsr_setup' from source: include params 30529 1726882684.99468: variable 'lsr_setup' from source: include params 30529 1726882684.99517: handler run complete 30529 1726882684.99534: attempt loop complete, returning result 30529 1726882684.99555: variable 'item' from source: unknown 30529 1726882684.99620: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 30529 1726882684.99717: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882684.99720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882684.99722: variable 'omit' from source: magic vars 30529 1726882684.99998: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.00001: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.00004: variable 'omit' from source: magic vars 30529 1726882685.00006: variable 'omit' from source: magic vars 30529 1726882685.00008: variable 'item' from source: unknown 30529 1726882685.00011: variable 'item' from source: unknown 30529 1726882685.00026: variable 'omit' from source: magic vars 30529 1726882685.00044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.00055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.00058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.00067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.00197: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.00201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.00203: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.00205: Set connection var ansible_pipelining to False 30529 1726882685.00207: Set connection var ansible_shell_type to sh 30529 1726882685.00209: Set connection var ansible_timeout to 10 30529 1726882685.00211: Set connection var 
ansible_connection to ssh 30529 1726882685.00213: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.00215: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.00217: variable 'ansible_connection' from source: unknown 30529 1726882685.00219: variable 'ansible_module_compression' from source: unknown 30529 1726882685.00221: variable 'ansible_shell_type' from source: unknown 30529 1726882685.00223: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.00225: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.00227: variable 'ansible_pipelining' from source: unknown 30529 1726882685.00229: variable 'ansible_timeout' from source: unknown 30529 1726882685.00233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.00330: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.00339: variable 'omit' from source: magic vars 30529 1726882685.00342: starting attempt loop 30529 1726882685.00344: running the handler 30529 1726882685.00365: variable 'lsr_test' from source: include params 30529 1726882685.00435: variable 'lsr_test' from source: include params 30529 1726882685.00456: handler run complete 30529 1726882685.00464: attempt loop complete, returning result 30529 1726882685.00478: variable 'item' from source: unknown 30529 1726882685.00568: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30529 1726882685.00641: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.00645: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882685.00648: variable 'omit' from source: magic vars 30529 1726882685.00998: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.01002: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.01004: variable 'omit' from source: magic vars 30529 1726882685.01007: variable 'omit' from source: magic vars 30529 1726882685.01009: variable 'item' from source: unknown 30529 1726882685.01116: variable 'item' from source: unknown 30529 1726882685.01130: variable 'omit' from source: magic vars 30529 1726882685.01146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.01218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.01222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.01244: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.01247: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.01250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.01450: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.01453: Set connection var ansible_pipelining to False 30529 1726882685.01456: Set connection var ansible_shell_type to sh 30529 1726882685.01466: Set connection var ansible_timeout to 10 30529 1726882685.01468: Set connection var ansible_connection to ssh 30529 1726882685.01473: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.01494: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.01617: variable 'ansible_connection' from source: unknown 30529 1726882685.01620: 
variable 'ansible_module_compression' from source: unknown 30529 1726882685.01623: variable 'ansible_shell_type' from source: unknown 30529 1726882685.01625: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.01627: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.01628: variable 'ansible_pipelining' from source: unknown 30529 1726882685.01630: variable 'ansible_timeout' from source: unknown 30529 1726882685.01632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.01868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.01871: variable 'omit' from source: magic vars 30529 1726882685.01874: starting attempt loop 30529 1726882685.01930: running the handler 30529 1726882685.01976: variable 'lsr_assert' from source: include params 30529 1726882685.02115: variable 'lsr_assert' from source: include params 30529 1726882685.02134: handler run complete 30529 1726882685.02147: attempt loop complete, returning result 30529 1726882685.02197: variable 'item' from source: unknown 30529 1726882685.02346: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 30529 1726882685.02638: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.02643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.02646: variable 'omit' from source: magic vars 30529 1726882685.02901: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.02910: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30529 1726882685.02912: variable 'omit' from source: magic vars 30529 1726882685.03000: variable 'omit' from source: magic vars 30529 1726882685.03003: variable 'item' from source: unknown 30529 1726882685.03033: variable 'item' from source: unknown 30529 1726882685.03047: variable 'omit' from source: magic vars 30529 1726882685.03073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.03080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.03092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.03107: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.03110: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.03112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.03184: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.03190: Set connection var ansible_pipelining to False 30529 1726882685.03195: Set connection var ansible_shell_type to sh 30529 1726882685.03217: Set connection var ansible_timeout to 10 30529 1726882685.03220: Set connection var ansible_connection to ssh 30529 1726882685.03222: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.03299: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.03302: variable 'ansible_connection' from source: unknown 30529 1726882685.03304: variable 'ansible_module_compression' from source: unknown 30529 1726882685.03307: variable 'ansible_shell_type' from source: unknown 30529 1726882685.03310: variable 'ansible_shell_executable' from source: unknown 30529 
1726882685.03312: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.03315: variable 'ansible_pipelining' from source: unknown 30529 1726882685.03319: variable 'ansible_timeout' from source: unknown 30529 1726882685.03325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.03353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.03360: variable 'omit' from source: magic vars 30529 1726882685.03362: starting attempt loop 30529 1726882685.03365: running the handler 30529 1726882685.03394: variable 'lsr_assert_when' from source: include params 30529 1726882685.03456: variable 'lsr_assert_when' from source: include params 30529 1726882685.03598: variable 'network_provider' from source: set_fact 30529 1726882685.03601: handler run complete 30529 1726882685.03603: attempt loop complete, returning result 30529 1726882685.03608: variable 'item' from source: unknown 30529 1726882685.03667: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30529 1726882685.03752: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.03870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.03873: variable 'omit' from source: magic vars 30529 1726882685.03992: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.03997: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.03999: variable 'omit' from source: magic vars 30529 1726882685.04001: 
variable 'omit' from source: magic vars 30529 1726882685.04003: variable 'item' from source: unknown 30529 1726882685.04021: variable 'item' from source: unknown 30529 1726882685.04041: variable 'omit' from source: magic vars 30529 1726882685.04058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.04065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.04071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.04081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.04084: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.04087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.04164: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.04168: Set connection var ansible_pipelining to False 30529 1726882685.04170: Set connection var ansible_shell_type to sh 30529 1726882685.04180: Set connection var ansible_timeout to 10 30529 1726882685.04183: Set connection var ansible_connection to ssh 30529 1726882685.04190: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.04210: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.04213: variable 'ansible_connection' from source: unknown 30529 1726882685.04216: variable 'ansible_module_compression' from source: unknown 30529 1726882685.04218: variable 'ansible_shell_type' from source: unknown 30529 1726882685.04220: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.04223: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.04298: variable 'ansible_pipelining' from source: 
unknown 30529 1726882685.04301: variable 'ansible_timeout' from source: unknown 30529 1726882685.04304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.04324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.04331: variable 'omit' from source: magic vars 30529 1726882685.04334: starting attempt loop 30529 1726882685.04336: running the handler 30529 1726882685.04356: variable 'lsr_fail_debug' from source: play vars 30529 1726882685.04423: variable 'lsr_fail_debug' from source: play vars 30529 1726882685.04440: handler run complete 30529 1726882685.04453: attempt loop complete, returning result 30529 1726882685.04474: variable 'item' from source: unknown 30529 1726882685.04533: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30529 1726882685.04615: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.04618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.04751: variable 'omit' from source: magic vars 30529 1726882685.04785: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.04791: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.04803: variable 'omit' from source: magic vars 30529 1726882685.04817: variable 'omit' from source: magic vars 30529 1726882685.04860: variable 'item' from source: unknown 30529 1726882685.04969: variable 'item' from source: unknown 30529 1726882685.04989: variable 'omit' from source: magic vars 30529 1726882685.05034: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.05042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.05047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.05085: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.05088: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.05102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.05197: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.05247: Set connection var ansible_pipelining to False 30529 1726882685.05250: Set connection var ansible_shell_type to sh 30529 1726882685.05278: Set connection var ansible_timeout to 10 30529 1726882685.05281: Set connection var ansible_connection to ssh 30529 1726882685.05303: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.05363: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.05366: variable 'ansible_connection' from source: unknown 30529 1726882685.05368: variable 'ansible_module_compression' from source: unknown 30529 1726882685.05371: variable 'ansible_shell_type' from source: unknown 30529 1726882685.05373: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.05418: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.05421: variable 'ansible_pipelining' from source: unknown 30529 1726882685.05423: variable 'ansible_timeout' from source: unknown 30529 1726882685.05425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.05539: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.05647: variable 'omit' from source: magic vars 30529 1726882685.05650: starting attempt loop 30529 1726882685.05653: running the handler 30529 1726882685.05655: variable 'lsr_cleanup' from source: include params 30529 1726882685.05718: variable 'lsr_cleanup' from source: include params 30529 1726882685.05741: handler run complete 30529 1726882685.05808: attempt loop complete, returning result 30529 1726882685.05824: variable 'item' from source: unknown 30529 1726882685.05905: variable 'item' from source: unknown ok: [managed_node1] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 30529 1726882685.05983: dumping result to json 30529 1726882685.06102: done dumping result, returning 30529 1726882685.06105: done running TaskExecutor() for managed_node1/TASK: Show item [12673a56-9f93-b0f1-edc0-0000000020ae] 30529 1726882685.06107: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020ae 30529 1726882685.06148: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020ae 30529 1726882685.06151: WORKER PROCESS EXITING 30529 1726882685.06264: no more pending results, returning what we have 30529 1726882685.06268: results queue empty 30529 1726882685.06269: checking for any_errors_fatal 30529 1726882685.06279: done checking for any_errors_fatal 30529 1726882685.06280: checking for max_fail_percentage 30529 1726882685.06281: done checking for max_fail_percentage 30529 1726882685.06284: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.06285: done checking to see if all hosts have failed 30529 1726882685.06285: getting the 
remaining hosts for this loop 30529 1726882685.06288: done getting the remaining hosts for this loop 30529 1726882685.06292: getting the next task for host managed_node1 30529 1726882685.06303: done getting next task for host managed_node1 30529 1726882685.06306: ^ task is: TASK: Include the task 'show_interfaces.yml' 30529 1726882685.06308: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882685.06312: getting variables 30529 1726882685.06314: in VariableManager get_vars() 30529 1726882685.06365: Calling all_inventory to load vars for managed_node1 30529 1726882685.06369: Calling groups_inventory to load vars for managed_node1 30529 1726882685.06374: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.06385: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.06389: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.06392: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.09584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.11558: done with get_vars() 30529 1726882685.11602: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:38:05 -0400 (0:00:00.165) 
0:01:39.143 ****** 30529 1726882685.11707: entering _queue_task() for managed_node1/include_tasks 30529 1726882685.12130: worker is 1 (out of 1 available) 30529 1726882685.12142: exiting _queue_task() for managed_node1/include_tasks 30529 1726882685.12155: done queuing things up, now waiting for results queue to drain 30529 1726882685.12156: waiting for pending results... 30529 1726882685.12569: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 30529 1726882685.12576: in run() - task 12673a56-9f93-b0f1-edc0-0000000020af 30529 1726882685.12585: variable 'ansible_search_path' from source: unknown 30529 1726882685.12589: variable 'ansible_search_path' from source: unknown 30529 1726882685.12637: calling self._execute() 30529 1726882685.12740: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.12746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.12800: variable 'omit' from source: magic vars 30529 1726882685.13471: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.13523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.13567: _execute() done 30529 1726882685.13590: dumping result to json 30529 1726882685.13600: done dumping result, returning 30529 1726882685.13669: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-b0f1-edc0-0000000020af] 30529 1726882685.13673: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020af 30529 1726882685.13745: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020af 30529 1726882685.13749: WORKER PROCESS EXITING 30529 1726882685.13818: no more pending results, returning what we have 30529 1726882685.13824: in VariableManager get_vars() 30529 1726882685.13873: Calling all_inventory to load vars for managed_node1 30529 1726882685.13875: Calling groups_inventory to load vars for 
managed_node1 30529 1726882685.13879: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.13900: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.13905: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.13908: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.16226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.18214: done with get_vars() 30529 1726882685.18238: variable 'ansible_search_path' from source: unknown 30529 1726882685.18243: variable 'ansible_search_path' from source: unknown 30529 1726882685.18307: we have included files to process 30529 1726882685.18309: generating all_blocks data 30529 1726882685.18311: done generating all_blocks data 30529 1726882685.18319: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882685.18320: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882685.18322: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30529 1726882685.18520: in VariableManager get_vars() 30529 1726882685.18542: done with get_vars() 30529 1726882685.18758: done processing included file 30529 1726882685.18762: iterating over new_blocks loaded from include file 30529 1726882685.18783: in VariableManager get_vars() 30529 1726882685.18820: done with get_vars() 30529 1726882685.18822: filtering new block on tags 30529 1726882685.18856: done filtering new block on tags 30529 1726882685.18858: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 30529 
1726882685.18863: extending task lists for all hosts with included blocks 30529 1726882685.19476: done extending task lists 30529 1726882685.19477: done processing included files 30529 1726882685.19478: results queue empty 30529 1726882685.19479: checking for any_errors_fatal 30529 1726882685.19485: done checking for any_errors_fatal 30529 1726882685.19486: checking for max_fail_percentage 30529 1726882685.19490: done checking for max_fail_percentage 30529 1726882685.19491: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.19492: done checking to see if all hosts have failed 30529 1726882685.19494: getting the remaining hosts for this loop 30529 1726882685.19495: done getting the remaining hosts for this loop 30529 1726882685.19498: getting the next task for host managed_node1 30529 1726882685.19528: done getting next task for host managed_node1 30529 1726882685.19533: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30529 1726882685.19537: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882685.19540: getting variables 30529 1726882685.19541: in VariableManager get_vars() 30529 1726882685.19577: Calling all_inventory to load vars for managed_node1 30529 1726882685.19579: Calling groups_inventory to load vars for managed_node1 30529 1726882685.19582: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.19591: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.19595: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.19599: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.21474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.24047: done with get_vars() 30529 1726882685.24082: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:38:05 -0400 (0:00:00.124) 0:01:39.267 ****** 30529 1726882685.24180: entering _queue_task() for managed_node1/include_tasks 30529 1726882685.24816: worker is 1 (out of 1 available) 30529 1726882685.24833: exiting _queue_task() for managed_node1/include_tasks 30529 1726882685.24856: done queuing things up, now waiting for results queue to drain 30529 1726882685.24858: waiting for pending results... 
30529 1726882685.25211: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 30529 1726882685.25321: in run() - task 12673a56-9f93-b0f1-edc0-0000000020d6 30529 1726882685.25426: variable 'ansible_search_path' from source: unknown 30529 1726882685.25430: variable 'ansible_search_path' from source: unknown 30529 1726882685.25435: calling self._execute() 30529 1726882685.25520: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.25532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.25554: variable 'omit' from source: magic vars 30529 1726882685.26016: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.26034: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.26046: _execute() done 30529 1726882685.26054: dumping result to json 30529 1726882685.26060: done dumping result, returning 30529 1726882685.26070: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-b0f1-edc0-0000000020d6] 30529 1726882685.26080: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020d6 30529 1726882685.26336: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020d6 30529 1726882685.26341: WORKER PROCESS EXITING 30529 1726882685.26368: no more pending results, returning what we have 30529 1726882685.26373: in VariableManager get_vars() 30529 1726882685.26621: Calling all_inventory to load vars for managed_node1 30529 1726882685.26624: Calling groups_inventory to load vars for managed_node1 30529 1726882685.26630: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.26645: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.26650: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.26654: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882685.28592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.35148: done with get_vars() 30529 1726882685.35165: variable 'ansible_search_path' from source: unknown 30529 1726882685.35166: variable 'ansible_search_path' from source: unknown 30529 1726882685.35191: we have included files to process 30529 1726882685.35195: generating all_blocks data 30529 1726882685.35197: done generating all_blocks data 30529 1726882685.35201: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882685.35202: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882685.35205: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30529 1726882685.35372: done processing included file 30529 1726882685.35373: iterating over new_blocks loaded from include file 30529 1726882685.35374: in VariableManager get_vars() 30529 1726882685.35387: done with get_vars() 30529 1726882685.35388: filtering new block on tags 30529 1726882685.35412: done filtering new block on tags 30529 1726882685.35413: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 30529 1726882685.35416: extending task lists for all hosts with included blocks 30529 1726882685.35524: done extending task lists 30529 1726882685.35528: done processing included files 30529 1726882685.35529: results queue empty 30529 1726882685.35529: checking for any_errors_fatal 30529 1726882685.35532: done checking for any_errors_fatal 30529 1726882685.35533: checking for max_fail_percentage 30529 1726882685.35534: done 
checking for max_fail_percentage 30529 1726882685.35534: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.35535: done checking to see if all hosts have failed 30529 1726882685.35536: getting the remaining hosts for this loop 30529 1726882685.35537: done getting the remaining hosts for this loop 30529 1726882685.35540: getting the next task for host managed_node1 30529 1726882685.35543: done getting next task for host managed_node1 30529 1726882685.35545: ^ task is: TASK: Gather current interface info 30529 1726882685.35547: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882685.35549: getting variables 30529 1726882685.35549: in VariableManager get_vars() 30529 1726882685.35556: Calling all_inventory to load vars for managed_node1 30529 1726882685.35557: Calling groups_inventory to load vars for managed_node1 30529 1726882685.35559: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.35563: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.35564: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.35567: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.36266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.37113: done with get_vars() 30529 1726882685.37129: done getting variables 30529 1726882685.37154: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:38:05 -0400 (0:00:00.129) 0:01:39.397 ****** 30529 1726882685.37172: entering _queue_task() for managed_node1/command 30529 1726882685.37547: worker is 1 (out of 1 available) 30529 1726882685.37558: exiting _queue_task() for managed_node1/command 30529 1726882685.37571: done queuing things up, now waiting for results queue to drain 30529 1726882685.37573: waiting for pending results... 
30529 1726882685.37864: running TaskExecutor() for managed_node1/TASK: Gather current interface info 30529 1726882685.38198: in run() - task 12673a56-9f93-b0f1-edc0-000000002111 30529 1726882685.38203: variable 'ansible_search_path' from source: unknown 30529 1726882685.38206: variable 'ansible_search_path' from source: unknown 30529 1726882685.38208: calling self._execute() 30529 1726882685.38211: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.38213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.38216: variable 'omit' from source: magic vars 30529 1726882685.38587: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.38608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.38619: variable 'omit' from source: magic vars 30529 1726882685.38682: variable 'omit' from source: magic vars 30529 1726882685.38720: variable 'omit' from source: magic vars 30529 1726882685.38768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882685.38809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.38836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882685.38861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.38887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.38923: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.38932: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.38941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882685.39053: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.39064: Set connection var ansible_pipelining to False 30529 1726882685.39071: Set connection var ansible_shell_type to sh 30529 1726882685.39085: Set connection var ansible_timeout to 10 30529 1726882685.39098: Set connection var ansible_connection to ssh 30529 1726882685.39109: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.39136: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.39144: variable 'ansible_connection' from source: unknown 30529 1726882685.39152: variable 'ansible_module_compression' from source: unknown 30529 1726882685.39159: variable 'ansible_shell_type' from source: unknown 30529 1726882685.39165: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.39172: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.39181: variable 'ansible_pipelining' from source: unknown 30529 1726882685.39187: variable 'ansible_timeout' from source: unknown 30529 1726882685.39198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.39339: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.39360: variable 'omit' from source: magic vars 30529 1726882685.39370: starting attempt loop 30529 1726882685.39417: running the handler 30529 1726882685.39420: _low_level_execute_command(): starting 30529 1726882685.39423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882685.40145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882685.40159: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882685.40190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882685.40300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882685.40316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882685.40337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.40424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.42125: stdout chunk (state=3): >>>/root <<< 30529 1726882685.42289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882685.42297: stdout chunk (state=3): >>><<< 30529 1726882685.42300: stderr chunk (state=3): >>><<< 30529 1726882685.42328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882685.42430: _low_level_execute_command(): starting 30529 1726882685.42434: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871 `" && echo ansible-tmp-1726882685.4233527-35227-88470474517871="` echo /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871 `" ) && sleep 0' 30529 1726882685.42959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882685.42974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882685.42990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882685.43009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882685.43033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882685.43053: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882685.43112: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882685.43170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882685.43187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882685.43212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.43286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.45170: stdout chunk (state=3): >>>ansible-tmp-1726882685.4233527-35227-88470474517871=/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871 <<< 30529 1726882685.45306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882685.45325: stdout chunk (state=3): >>><<< 30529 1726882685.45337: stderr chunk (state=3): >>><<< 30529 1726882685.45358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882685.4233527-35227-88470474517871=/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882685.45498: variable 'ansible_module_compression' from source: unknown 30529 1726882685.45501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882685.45504: variable 'ansible_facts' from source: unknown 30529 1726882685.45585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py 30529 1726882685.45758: Sending initial data 30529 1726882685.45761: Sent initial data (155 bytes) 30529 1726882685.46504: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882685.46560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882685.46585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882685.46618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.46686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.48224: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882685.48267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882685.48308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5_dvaw9p /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py <<< 30529 1726882685.48311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py" <<< 30529 1726882685.48354: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5_dvaw9p" to remote "/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py" <<< 30529 1726882685.48892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882685.48999: stderr chunk (state=3): >>><<< 30529 1726882685.49002: stdout chunk (state=3): >>><<< 30529 1726882685.49004: done transferring module to remote 30529 1726882685.49007: _low_level_execute_command(): starting 30529 1726882685.49009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/ /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py && sleep 0' 30529 1726882685.49402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882685.49405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882685.49452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882685.49460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882685.49524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.49556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.51273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882685.51298: stderr chunk (state=3): >>><<< 30529 1726882685.51302: stdout chunk (state=3): >>><<< 30529 1726882685.51310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882685.51314: _low_level_execute_command(): starting 30529 1726882685.51319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/AnsiballZ_command.py && sleep 0' 30529 1726882685.51733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882685.51736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882685.51739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882685.51741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882685.51743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882685.51784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882685.51800: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.51852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.67309: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:05.668861", "end": "2024-09-20 21:38:05.671922", "delta": "0:00:00.003061", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882685.68777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882685.68782: stdout chunk (state=3): >>><<< 30529 1726882685.68784: stderr chunk (state=3): >>><<< 30529 1726882685.68936: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:05.668861", "end": "2024-09-20 21:38:05.671922", "delta": "0:00:00.003061", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882685.68941: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882685.68944: _low_level_execute_command(): starting 30529 1726882685.68946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882685.4233527-35227-88470474517871/ > /dev/null 2>&1 && sleep 0' 30529 1726882685.69533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882685.69546: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882685.69562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882685.69580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882685.69618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882685.69721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882685.69744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882685.69775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882685.69843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882685.71760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882685.71763: stdout chunk (state=3): >>><<< 30529 1726882685.71766: stderr chunk (state=3): >>><<< 30529 1726882685.71769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882685.71771: handler run complete 30529 1726882685.71773: Evaluated conditional (False): False 30529 1726882685.71775: attempt loop complete, returning result 30529 1726882685.71776: _execute() done 30529 1726882685.71778: dumping result to json 30529 1726882685.71780: done dumping result, returning 30529 1726882685.71805: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-b0f1-edc0-000000002111] 30529 1726882685.71815: sending task result for task 12673a56-9f93-b0f1-edc0-000000002111 ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003061", "end": "2024-09-20 21:38:05.671922", "rc": 0, "start": "2024-09-20 21:38:05.668861" } STDOUT: bonding_masters eth0 lo 30529 1726882685.72135: no more pending results, returning what we have 30529 1726882685.72138: results queue empty 30529 1726882685.72139: checking for any_errors_fatal 30529 1726882685.72141: done checking for any_errors_fatal 30529 1726882685.72142: checking for max_fail_percentage 30529 1726882685.72143: done checking for 
max_fail_percentage 30529 1726882685.72144: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.72145: done checking to see if all hosts have failed 30529 1726882685.72146: getting the remaining hosts for this loop 30529 1726882685.72148: done getting the remaining hosts for this loop 30529 1726882685.72152: getting the next task for host managed_node1 30529 1726882685.72161: done getting next task for host managed_node1 30529 1726882685.72163: ^ task is: TASK: Set current_interfaces 30529 1726882685.72169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882685.72173: getting variables 30529 1726882685.72175: in VariableManager get_vars() 30529 1726882685.72223: Calling all_inventory to load vars for managed_node1 30529 1726882685.72226: Calling groups_inventory to load vars for managed_node1 30529 1726882685.72230: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.72319: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.72323: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.72326: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.72927: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002111 30529 1726882685.72930: WORKER PROCESS EXITING 30529 1726882685.73423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.74315: done with get_vars() 30529 1726882685.74330: done getting variables 30529 1726882685.74377: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:38:05 -0400 (0:00:00.372) 0:01:39.770 ****** 30529 1726882685.74403: entering _queue_task() for managed_node1/set_fact 30529 1726882685.74709: worker is 1 (out of 1 available) 30529 1726882685.74721: exiting _queue_task() for managed_node1/set_fact 30529 1726882685.74736: done queuing things up, now waiting for results queue to drain 30529 1726882685.74737: waiting for pending results... 
30529 1726882685.75028: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 30529 1726882685.75158: in run() - task 12673a56-9f93-b0f1-edc0-000000002112 30529 1726882685.75172: variable 'ansible_search_path' from source: unknown 30529 1726882685.75176: variable 'ansible_search_path' from source: unknown 30529 1726882685.75215: calling self._execute() 30529 1726882685.75319: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.75323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.75342: variable 'omit' from source: magic vars 30529 1726882685.75770: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.75773: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.75776: variable 'omit' from source: magic vars 30529 1726882685.75819: variable 'omit' from source: magic vars 30529 1726882685.75936: variable '_current_interfaces' from source: set_fact 30529 1726882685.76012: variable 'omit' from source: magic vars 30529 1726882685.76099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882685.76103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.76110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882685.76129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.76141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.76172: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.76175: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.76178: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.76311: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.76317: Set connection var ansible_pipelining to False 30529 1726882685.76320: Set connection var ansible_shell_type to sh 30529 1726882685.76322: Set connection var ansible_timeout to 10 30529 1726882685.76325: Set connection var ansible_connection to ssh 30529 1726882685.76327: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.76338: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.76341: variable 'ansible_connection' from source: unknown 30529 1726882685.76344: variable 'ansible_module_compression' from source: unknown 30529 1726882685.76346: variable 'ansible_shell_type' from source: unknown 30529 1726882685.76348: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.76350: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.76352: variable 'ansible_pipelining' from source: unknown 30529 1726882685.76418: variable 'ansible_timeout' from source: unknown 30529 1726882685.76423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.76525: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.76529: variable 'omit' from source: magic vars 30529 1726882685.76534: starting attempt loop 30529 1726882685.76631: running the handler 30529 1726882685.76634: handler run complete 30529 1726882685.76635: attempt loop complete, returning result 30529 1726882685.76637: _execute() done 30529 1726882685.76639: dumping result to json 30529 1726882685.76640: done dumping result, returning 30529 
1726882685.76642: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-b0f1-edc0-000000002112] 30529 1726882685.76645: sending task result for task 12673a56-9f93-b0f1-edc0-000000002112 30529 1726882685.76707: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002112 30529 1726882685.76709: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 30529 1726882685.76761: no more pending results, returning what we have 30529 1726882685.76764: results queue empty 30529 1726882685.76765: checking for any_errors_fatal 30529 1726882685.76772: done checking for any_errors_fatal 30529 1726882685.76773: checking for max_fail_percentage 30529 1726882685.76774: done checking for max_fail_percentage 30529 1726882685.76775: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.76776: done checking to see if all hosts have failed 30529 1726882685.76777: getting the remaining hosts for this loop 30529 1726882685.76778: done getting the remaining hosts for this loop 30529 1726882685.76782: getting the next task for host managed_node1 30529 1726882685.76791: done getting next task for host managed_node1 30529 1726882685.76795: ^ task is: TASK: Show current_interfaces 30529 1726882685.76799: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882685.76802: getting variables 30529 1726882685.76803: in VariableManager get_vars() 30529 1726882685.76837: Calling all_inventory to load vars for managed_node1 30529 1726882685.76840: Calling groups_inventory to load vars for managed_node1 30529 1726882685.76844: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.76854: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.76856: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.76859: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.78886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.81852: done with get_vars() 30529 1726882685.81895: done getting variables 30529 1726882685.81954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:38:05 -0400 (0:00:00.075) 0:01:39.845 ****** 30529 1726882685.81998: entering _queue_task() for managed_node1/debug 30529 1726882685.82703: worker is 1 (out of 1 available) 30529 1726882685.82714: exiting _queue_task() for managed_node1/debug 30529 1726882685.82724: done queuing things up, now waiting for results queue to drain 30529 1726882685.82725: waiting for pending results... 
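The record above queues the `Show current_interfaces` task from `tasks/show_interfaces.yml:5`. The playbook source itself is not part of this log, but the task name, the `debug` action plugin being loaded, and the rendered message in the result that follows (`MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`) suggest it is a minimal debug task along these lines (a plausible reconstruction, not the actual test file):

```yaml
# Hypothetical reconstruction of the task at show_interfaces.yml:5
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

The `current_interfaces` fact it prints was set by the preceding `Set current_interfaces` set_fact task logged above.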
30529 1726882685.83265: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 30529 1726882685.83280: in run() - task 12673a56-9f93-b0f1-edc0-0000000020d7 30529 1726882685.83296: variable 'ansible_search_path' from source: unknown 30529 1726882685.83300: variable 'ansible_search_path' from source: unknown 30529 1726882685.83421: calling self._execute() 30529 1726882685.83900: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.83903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.83906: variable 'omit' from source: magic vars 30529 1726882685.84160: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.84316: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.84500: variable 'omit' from source: magic vars 30529 1726882685.84503: variable 'omit' from source: magic vars 30529 1726882685.84684: variable 'current_interfaces' from source: set_fact 30529 1726882685.84772: variable 'omit' from source: magic vars 30529 1726882685.84889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882685.84936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882685.85022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882685.85043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.85114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882685.85149: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882685.85178: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.85205: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.85434: Set connection var ansible_shell_executable to /bin/sh 30529 1726882685.85468: Set connection var ansible_pipelining to False 30529 1726882685.85514: Set connection var ansible_shell_type to sh 30529 1726882685.85546: Set connection var ansible_timeout to 10 30529 1726882685.85617: Set connection var ansible_connection to ssh 30529 1726882685.85630: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882685.85657: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.85677: variable 'ansible_connection' from source: unknown 30529 1726882685.85705: variable 'ansible_module_compression' from source: unknown 30529 1726882685.85750: variable 'ansible_shell_type' from source: unknown 30529 1726882685.85759: variable 'ansible_shell_executable' from source: unknown 30529 1726882685.85768: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.85777: variable 'ansible_pipelining' from source: unknown 30529 1726882685.85785: variable 'ansible_timeout' from source: unknown 30529 1726882685.85828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.86061: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882685.86080: variable 'omit' from source: magic vars 30529 1726882685.86091: starting attempt loop 30529 1726882685.86101: running the handler 30529 1726882685.86189: handler run complete 30529 1726882685.86262: attempt loop complete, returning result 30529 1726882685.86265: _execute() done 30529 1726882685.86267: dumping result to json 30529 1726882685.86269: done dumping result, returning 30529 1726882685.86272: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-b0f1-edc0-0000000020d7] 30529 1726882685.86279: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020d7 30529 1726882685.86349: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020d7 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 30529 1726882685.86438: no more pending results, returning what we have 30529 1726882685.86442: results queue empty 30529 1726882685.86443: checking for any_errors_fatal 30529 1726882685.86452: done checking for any_errors_fatal 30529 1726882685.86453: checking for max_fail_percentage 30529 1726882685.86455: done checking for max_fail_percentage 30529 1726882685.86456: checking to see if all hosts have failed and the running result is not ok 30529 1726882685.86457: done checking to see if all hosts have failed 30529 1726882685.86457: getting the remaining hosts for this loop 30529 1726882685.86459: done getting the remaining hosts for this loop 30529 1726882685.86463: getting the next task for host managed_node1 30529 1726882685.86476: done getting next task for host managed_node1 30529 1726882685.86480: ^ task is: TASK: Setup 30529 1726882685.86483: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882685.86489: getting variables 30529 1726882685.86491: in VariableManager get_vars() 30529 1726882685.86536: Calling all_inventory to load vars for managed_node1 30529 1726882685.86539: Calling groups_inventory to load vars for managed_node1 30529 1726882685.86543: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.86556: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.86560: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.86563: Calling groups_plugins_play to load vars for managed_node1 30529 1726882685.87206: WORKER PROCESS EXITING 30529 1726882685.88672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882685.91341: done with get_vars() 30529 1726882685.91370: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:38:05 -0400 (0:00:00.094) 0:01:39.940 ****** 30529 1726882685.91477: entering _queue_task() for managed_node1/include_tasks 30529 1726882685.91999: worker is 1 (out of 1 available) 30529 1726882685.92010: exiting _queue_task() for managed_node1/include_tasks 30529 1726882685.92023: done queuing things up, now waiting for results queue to drain 30529 1726882685.92024: waiting for pending results... 
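The `Setup` task queued above (from `run_test.yml:24`) is an `include_tasks` that loops over `lsr_setup` "include params": the log evaluates the same `ansible_distribution_major_version != '6'` conditional once per `item`, and the subsequent records include `create_bridge_profile.yml`, `activate_profile.yml`, and `remove+down_profile.yml`. A sketch consistent with that behavior (the exact source is not shown in this log):

```yaml
# Hypothetical sketch of the Setup task at run_test.yml:24
- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"   # in this run: tasks/create_bridge_profile.yml,
                            # tasks/activate_profile.yml, tasks/remove+down_profile.yml
  when: ansible_distribution_major_version != '6'
```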
30529 1726882685.92312: running TaskExecutor() for managed_node1/TASK: Setup 30529 1726882685.92431: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b0 30529 1726882685.92434: variable 'ansible_search_path' from source: unknown 30529 1726882685.92438: variable 'ansible_search_path' from source: unknown 30529 1726882685.92477: variable 'lsr_setup' from source: include params 30529 1726882685.93200: variable 'lsr_setup' from source: include params 30529 1726882685.93203: variable 'omit' from source: magic vars 30529 1726882685.93628: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.93637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.93646: variable 'omit' from source: magic vars 30529 1726882685.94488: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.94590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.94603: variable 'item' from source: unknown 30529 1726882685.94667: variable 'item' from source: unknown 30529 1726882685.94708: variable 'item' from source: unknown 30529 1726882685.94963: variable 'item' from source: unknown 30529 1726882685.95579: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882685.95583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.95586: variable 'omit' from source: magic vars 30529 1726882685.95848: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.95853: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.95859: variable 'item' from source: unknown 30529 1726882685.96103: variable 'item' from source: unknown 30529 1726882685.96133: variable 'item' from source: unknown 30529 1726882685.96701: variable 'item' from source: unknown 30529 1726882685.96760: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882685.96763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882685.96765: variable 'omit' from source: magic vars 30529 1726882685.97024: variable 'ansible_distribution_major_version' from source: facts 30529 1726882685.97028: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882685.97031: variable 'item' from source: unknown 30529 1726882685.97140: variable 'item' from source: unknown 30529 1726882685.97169: variable 'item' from source: unknown 30529 1726882685.97308: variable 'item' from source: unknown 30529 1726882685.97376: dumping result to json 30529 1726882685.97380: done dumping result, returning 30529 1726882685.97383: done running TaskExecutor() for managed_node1/TASK: Setup [12673a56-9f93-b0f1-edc0-0000000020b0] 30529 1726882685.97387: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b0 30529 1726882685.97429: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b0 30529 1726882685.97433: WORKER PROCESS EXITING 30529 1726882685.97466: no more pending results, returning what we have 30529 1726882685.97472: in VariableManager get_vars() 30529 1726882685.97527: Calling all_inventory to load vars for managed_node1 30529 1726882685.97530: Calling groups_inventory to load vars for managed_node1 30529 1726882685.97534: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882685.97548: Calling all_plugins_play to load vars for managed_node1 30529 1726882685.97552: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882685.97555: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.01766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.05244: done with get_vars() 30529 1726882686.05275: variable 'ansible_search_path' from source: unknown 30529 1726882686.05276: variable 'ansible_search_path' from source: unknown 30529 
1726882686.05426: variable 'ansible_search_path' from source: unknown 30529 1726882686.05428: variable 'ansible_search_path' from source: unknown 30529 1726882686.05455: variable 'ansible_search_path' from source: unknown 30529 1726882686.05456: variable 'ansible_search_path' from source: unknown 30529 1726882686.05483: we have included files to process 30529 1726882686.05484: generating all_blocks data 30529 1726882686.05486: done generating all_blocks data 30529 1726882686.05616: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882686.05618: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882686.05622: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30529 1726882686.06356: done processing included file 30529 1726882686.06359: iterating over new_blocks loaded from include file 30529 1726882686.06360: in VariableManager get_vars() 30529 1726882686.06379: done with get_vars() 30529 1726882686.06381: filtering new block on tags 30529 1726882686.06557: done filtering new block on tags 30529 1726882686.06560: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node1 => (item=tasks/create_bridge_profile.yml) 30529 1726882686.06565: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882686.06566: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882686.06569: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30529 1726882686.06933: done processing included file 30529 1726882686.06935: iterating over new_blocks loaded from include file 30529 1726882686.06937: in VariableManager get_vars() 30529 1726882686.07069: done with get_vars() 30529 1726882686.07071: filtering new block on tags 30529 1726882686.07098: done filtering new block on tags 30529 1726882686.07101: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node1 => (item=tasks/activate_profile.yml) 30529 1726882686.07105: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882686.07106: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882686.07109: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882686.07637: done processing included file 30529 1726882686.07639: iterating over new_blocks loaded from include file 30529 1726882686.07641: in VariableManager get_vars() 30529 1726882686.07657: done with get_vars() 30529 1726882686.07659: filtering new block on tags 30529 1726882686.07681: done filtering new block on tags 30529 1726882686.07683: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node1 => (item=tasks/remove+down_profile.yml) 30529 1726882686.07686: extending task lists for all hosts with included blocks 30529 1726882686.10538: done extending task lists 30529 1726882686.10540: done processing 
included files 30529 1726882686.10540: results queue empty 30529 1726882686.10541: checking for any_errors_fatal 30529 1726882686.10545: done checking for any_errors_fatal 30529 1726882686.10546: checking for max_fail_percentage 30529 1726882686.10547: done checking for max_fail_percentage 30529 1726882686.10548: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.10548: done checking to see if all hosts have failed 30529 1726882686.10549: getting the remaining hosts for this loop 30529 1726882686.10551: done getting the remaining hosts for this loop 30529 1726882686.10553: getting the next task for host managed_node1 30529 1726882686.10558: done getting next task for host managed_node1 30529 1726882686.10560: ^ task is: TASK: Include network role 30529 1726882686.10563: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882686.10565: getting variables 30529 1726882686.10566: in VariableManager get_vars() 30529 1726882686.10578: Calling all_inventory to load vars for managed_node1 30529 1726882686.10581: Calling groups_inventory to load vars for managed_node1 30529 1726882686.10583: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.10591: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.10596: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.10599: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.13471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.16686: done with get_vars() 30529 1726882686.16717: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:38:06 -0400 (0:00:00.254) 0:01:40.195 ****** 30529 1726882686.16960: entering _queue_task() for managed_node1/include_role 30529 1726882686.17854: worker is 1 (out of 1 available) 30529 1726882686.17864: exiting _queue_task() for managed_node1/include_role 30529 1726882686.17875: done queuing things up, now waiting for results queue to drain 30529 1726882686.17877: waiting for pending results... 
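The `Include network role` task queued above (from `create_bridge_profile.yml:3`) is an `include_role` of `fedora.linux_system_roles.network`: the log shows the role's `defaults/main.yml`, `meta/main.yml`, and `tasks/main.yml` being loaded next. A sketch under that reading (the actual task file is not reproduced in this log):

```yaml
# Hypothetical sketch of create_bridge_profile.yml:3
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  when: ansible_distribution_major_version != '6'
```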
30529 1726882686.18541: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882686.18546: in run() - task 12673a56-9f93-b0f1-edc0-000000002139 30529 1726882686.18549: variable 'ansible_search_path' from source: unknown 30529 1726882686.18552: variable 'ansible_search_path' from source: unknown 30529 1726882686.18721: calling self._execute() 30529 1726882686.19017: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.19021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.19035: variable 'omit' from source: magic vars 30529 1726882686.19808: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.19819: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.19825: _execute() done 30529 1726882686.19828: dumping result to json 30529 1726882686.19830: done dumping result, returning 30529 1726882686.19838: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000002139] 30529 1726882686.19843: sending task result for task 12673a56-9f93-b0f1-edc0-000000002139 30529 1726882686.19987: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002139 30529 1726882686.19992: WORKER PROCESS EXITING 30529 1726882686.20026: no more pending results, returning what we have 30529 1726882686.20033: in VariableManager get_vars() 30529 1726882686.20081: Calling all_inventory to load vars for managed_node1 30529 1726882686.20084: Calling groups_inventory to load vars for managed_node1 30529 1726882686.20091: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.20110: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.20114: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.20117: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.22646: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.24482: done with get_vars() 30529 1726882686.24508: variable 'ansible_search_path' from source: unknown 30529 1726882686.24509: variable 'ansible_search_path' from source: unknown 30529 1726882686.24718: variable 'omit' from source: magic vars 30529 1726882686.24760: variable 'omit' from source: magic vars 30529 1726882686.24775: variable 'omit' from source: magic vars 30529 1726882686.24779: we have included files to process 30529 1726882686.24780: generating all_blocks data 30529 1726882686.24782: done generating all_blocks data 30529 1726882686.24783: processing included file: fedora.linux_system_roles.network 30529 1726882686.24812: in VariableManager get_vars() 30529 1726882686.24830: done with get_vars() 30529 1726882686.24858: in VariableManager get_vars() 30529 1726882686.24877: done with get_vars() 30529 1726882686.24924: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882686.25048: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882686.25135: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882686.25613: in VariableManager get_vars() 30529 1726882686.25635: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882686.28822: iterating over new_blocks loaded from include file 30529 1726882686.28824: in VariableManager get_vars() 30529 1726882686.28838: done with get_vars() 30529 1726882686.28839: filtering new block on tags 30529 1726882686.29077: done filtering new block on tags 30529 1726882686.29080: in VariableManager get_vars() 30529 1726882686.29095: done with get_vars() 30529 1726882686.29097: filtering new block on tags 30529 1726882686.29107: done 
filtering new block on tags 30529 1726882686.29108: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882686.29120: extending task lists for all hosts with included blocks 30529 1726882686.29261: done extending task lists 30529 1726882686.29263: done processing included files 30529 1726882686.29264: results queue empty 30529 1726882686.29264: checking for any_errors_fatal 30529 1726882686.29267: done checking for any_errors_fatal 30529 1726882686.29268: checking for max_fail_percentage 30529 1726882686.29269: done checking for max_fail_percentage 30529 1726882686.29270: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.29271: done checking to see if all hosts have failed 30529 1726882686.29271: getting the remaining hosts for this loop 30529 1726882686.29273: done getting the remaining hosts for this loop 30529 1726882686.29276: getting the next task for host managed_node1 30529 1726882686.29281: done getting next task for host managed_node1 30529 1726882686.29284: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882686.29290: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882686.29303: getting variables 30529 1726882686.29305: in VariableManager get_vars() 30529 1726882686.29322: Calling all_inventory to load vars for managed_node1 30529 1726882686.29324: Calling groups_inventory to load vars for managed_node1 30529 1726882686.29326: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.29335: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.29338: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.29340: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.30836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.33165: done with get_vars() 30529 1726882686.33198: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:06 -0400 (0:00:00.163) 0:01:40.358 ****** 30529 1726882686.33295: entering _queue_task() for managed_node1/include_tasks 30529 1726882686.33623: worker is 1 (out of 1 available) 30529 1726882686.33635: exiting _queue_task() for managed_node1/include_tasks 30529 1726882686.33647: done queuing things up, now waiting for results queue to drain 30529 1726882686.33648: waiting for pending results... 
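The `Ensure ansible_facts used by role` task queued above (from `roles/network/tasks/main.yml:4`) is an `include_tasks` whose effect, per the following records, is to pull in the role's `set_facts.yml`. A minimal sketch consistent with the log (the role's real task may carry additional tags or conditionals not visible here):

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:4
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```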
30529 1726882686.33846: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882686.33944: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a3 30529 1726882686.33957: variable 'ansible_search_path' from source: unknown 30529 1726882686.33961: variable 'ansible_search_path' from source: unknown 30529 1726882686.33991: calling self._execute() 30529 1726882686.34067: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.34070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.34079: variable 'omit' from source: magic vars 30529 1726882686.34411: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.34416: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.34419: _execute() done 30529 1726882686.34421: dumping result to json 30529 1726882686.34423: done dumping result, returning 30529 1726882686.34425: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-0000000021a3] 30529 1726882686.34427: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a3 30529 1726882686.34492: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a3 30529 1726882686.34497: WORKER PROCESS EXITING 30529 1726882686.34541: no more pending results, returning what we have 30529 1726882686.34546: in VariableManager get_vars() 30529 1726882686.34597: Calling all_inventory to load vars for managed_node1 30529 1726882686.34600: Calling groups_inventory to load vars for managed_node1 30529 1726882686.34602: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.34614: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.34616: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.34619: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882686.35836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.37176: done with get_vars() 30529 1726882686.37198: variable 'ansible_search_path' from source: unknown 30529 1726882686.37200: variable 'ansible_search_path' from source: unknown 30529 1726882686.37229: we have included files to process 30529 1726882686.37230: generating all_blocks data 30529 1726882686.37231: done generating all_blocks data 30529 1726882686.37233: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882686.37234: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882686.37235: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882686.37614: done processing included file 30529 1726882686.37616: iterating over new_blocks loaded from include file 30529 1726882686.37617: in VariableManager get_vars() 30529 1726882686.37633: done with get_vars() 30529 1726882686.37634: filtering new block on tags 30529 1726882686.37657: done filtering new block on tags 30529 1726882686.37659: in VariableManager get_vars() 30529 1726882686.37673: done with get_vars() 30529 1726882686.37674: filtering new block on tags 30529 1726882686.37702: done filtering new block on tags 30529 1726882686.37704: in VariableManager get_vars() 30529 1726882686.37719: done with get_vars() 30529 1726882686.37720: filtering new block on tags 30529 1726882686.37743: done filtering new block on tags 30529 1726882686.37744: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882686.37749: extending task lists for 
all hosts with included blocks 30529 1726882686.38702: done extending task lists 30529 1726882686.38703: done processing included files 30529 1726882686.38703: results queue empty 30529 1726882686.38704: checking for any_errors_fatal 30529 1726882686.38706: done checking for any_errors_fatal 30529 1726882686.38707: checking for max_fail_percentage 30529 1726882686.38707: done checking for max_fail_percentage 30529 1726882686.38708: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.38708: done checking to see if all hosts have failed 30529 1726882686.38709: getting the remaining hosts for this loop 30529 1726882686.38710: done getting the remaining hosts for this loop 30529 1726882686.38711: getting the next task for host managed_node1 30529 1726882686.38715: done getting next task for host managed_node1 30529 1726882686.38716: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882686.38719: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882686.38727: getting variables 30529 1726882686.38728: in VariableManager get_vars() 30529 1726882686.38739: Calling all_inventory to load vars for managed_node1 30529 1726882686.38741: Calling groups_inventory to load vars for managed_node1 30529 1726882686.38742: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.38745: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.38747: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.38749: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.39645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.40681: done with get_vars() 30529 1726882686.40712: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:06 -0400 (0:00:00.074) 0:01:40.433 ****** 30529 1726882686.40769: entering _queue_task() for managed_node1/setup 30529 1726882686.41122: worker is 1 (out of 1 available) 30529 1726882686.41141: exiting _queue_task() for managed_node1/setup 30529 1726882686.41154: done queuing things up, now waiting for results queue to drain 30529 1726882686.41155: waiting for pending results... 
30529 1726882686.41385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882686.41552: in run() - task 12673a56-9f93-b0f1-edc0-000000002200 30529 1726882686.41556: variable 'ansible_search_path' from source: unknown 30529 1726882686.41559: variable 'ansible_search_path' from source: unknown 30529 1726882686.41601: calling self._execute() 30529 1726882686.41677: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.41681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.41720: variable 'omit' from source: magic vars 30529 1726882686.42123: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.42142: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.42369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882686.44271: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882686.44337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882686.44389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882686.44425: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882686.44448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882686.44510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882686.44531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882686.44551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882686.44578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882686.44588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882686.44629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882686.44647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882686.44667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882686.44704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882686.44717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882686.44825: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882686.44832: variable 'ansible_facts' from source: unknown 30529 1726882686.45371: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882686.45375: when evaluation is False, skipping this task 30529 1726882686.45378: _execute() done 30529 1726882686.45381: dumping result to json 30529 1726882686.45383: done dumping result, returning 30529 1726882686.45392: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-000000002200] 30529 1726882686.45398: sending task result for task 12673a56-9f93-b0f1-edc0-000000002200 30529 1726882686.45495: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002200 30529 1726882686.45498: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882686.45545: no more pending results, returning what we have 30529 1726882686.45548: results queue empty 30529 1726882686.45550: checking for any_errors_fatal 30529 1726882686.45551: done checking for any_errors_fatal 30529 1726882686.45552: checking for max_fail_percentage 30529 1726882686.45554: done checking for max_fail_percentage 30529 1726882686.45555: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.45555: done checking to see if all hosts have failed 30529 1726882686.45556: getting the remaining hosts for this loop 30529 1726882686.45558: done getting the remaining hosts for this loop 30529 1726882686.45561: getting the next task for host managed_node1 30529 1726882686.45572: done getting next task for host managed_node1 30529 1726882686.45575: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882686.45581: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882686.45603: getting variables 30529 1726882686.45605: in VariableManager get_vars() 30529 1726882686.45650: Calling all_inventory to load vars for managed_node1 30529 1726882686.45653: Calling groups_inventory to load vars for managed_node1 30529 1726882686.45655: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.45665: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.45667: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.45675: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.46625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.47513: done with get_vars() 30529 1726882686.47530: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:06 -0400 (0:00:00.068) 0:01:40.502 ****** 30529 1726882686.47604: entering _queue_task() for managed_node1/stat 30529 1726882686.47845: worker is 1 (out of 1 available) 30529 1726882686.47858: exiting _queue_task() for managed_node1/stat 30529 1726882686.47870: done queuing things up, now waiting for results queue to drain 30529 1726882686.47871: waiting for pending results... 
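For context, the "Ensure ansible_facts used by role are present" task skipped above is a conditional fact-gathering step. A hedged reconstruction of the task at set_facts.yml:3 follows — only the task name and the `when` expression appear in the log; the `setup` options are assumptions:

```yaml
# Sketch of roles/network/tasks/set_facts.yml:3 (gather_subset is an assumption;
# only the task name and `when` condition are confirmed by the log above).
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

The `difference` filter yields the required fact names missing from `ansible_facts`. Every required fact was already cached, so the resulting list was empty, the conditional evaluated False, and the task was skipped (its result censored because `no_log: true` was set).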
30529 1726882686.48097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882686.48212: in run() - task 12673a56-9f93-b0f1-edc0-000000002202 30529 1726882686.48228: variable 'ansible_search_path' from source: unknown 30529 1726882686.48241: variable 'ansible_search_path' from source: unknown 30529 1726882686.48257: calling self._execute() 30529 1726882686.48328: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.48333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.48342: variable 'omit' from source: magic vars 30529 1726882686.48617: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.48627: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.48742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882686.48939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882686.48971: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882686.49002: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882686.49027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882686.49089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882686.49114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882686.49132: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882686.49149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882686.49222: variable '__network_is_ostree' from source: set_fact 30529 1726882686.49228: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882686.49231: when evaluation is False, skipping this task 30529 1726882686.49233: _execute() done 30529 1726882686.49236: dumping result to json 30529 1726882686.49239: done dumping result, returning 30529 1726882686.49246: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000002202] 30529 1726882686.49251: sending task result for task 12673a56-9f93-b0f1-edc0-000000002202 30529 1726882686.49335: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002202 30529 1726882686.49337: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882686.49388: no more pending results, returning what we have 30529 1726882686.49392: results queue empty 30529 1726882686.49395: checking for any_errors_fatal 30529 1726882686.49406: done checking for any_errors_fatal 30529 1726882686.49406: checking for max_fail_percentage 30529 1726882686.49408: done checking for max_fail_percentage 30529 1726882686.49409: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.49410: done checking to see if all hosts have failed 30529 1726882686.49411: getting the remaining hosts for this loop 30529 1726882686.49413: done getting the remaining hosts for this loop 30529 
1726882686.49416: getting the next task for host managed_node1 30529 1726882686.49424: done getting next task for host managed_node1 30529 1726882686.49427: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882686.49433: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882686.49453: getting variables 30529 1726882686.49455: in VariableManager get_vars() 30529 1726882686.49492: Calling all_inventory to load vars for managed_node1 30529 1726882686.49496: Calling groups_inventory to load vars for managed_node1 30529 1726882686.49499: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.49508: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.49510: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.49513: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.50943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.52255: done with get_vars() 30529 1726882686.52283: done getting variables 30529 1726882686.52357: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:06 -0400 (0:00:00.047) 0:01:40.550 ****** 30529 1726882686.52403: entering _queue_task() for managed_node1/set_fact 30529 1726882686.52764: worker is 1 (out of 1 available) 30529 1726882686.52777: exiting _queue_task() for managed_node1/set_fact 30529 1726882686.52797: done queuing things up, now waiting for results queue to drain 30529 1726882686.52799: waiting for pending results... 
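The two ostree tasks queued here (set_facts.yml:12 and :17) share a guard on `__network_is_ostree`. A hedged sketch of that pattern — the `stat` path and register name are assumptions not visible in the log:

```yaml
# Sketch of set_facts.yml:12 and :17. The /run/ostree-booted path and the
# register name are assumptions; the task names and `when` guard match the log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Both tasks skip in this run: the log shows `__network_is_ostree` sourced "from source: set_fact", i.e. it was already set earlier in the play, so `not __network_is_ostree is defined` evaluates False and each task reports `false_condition` in its skip result.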
30529 1726882686.53109: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882686.53218: in run() - task 12673a56-9f93-b0f1-edc0-000000002203 30529 1726882686.53230: variable 'ansible_search_path' from source: unknown 30529 1726882686.53234: variable 'ansible_search_path' from source: unknown 30529 1726882686.53286: calling self._execute() 30529 1726882686.53384: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.53388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.53417: variable 'omit' from source: magic vars 30529 1726882686.53785: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.53820: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.53994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882686.54221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882686.54288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882686.54332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882686.54365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882686.54439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882686.54464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882686.54479: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882686.54513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882686.54591: variable '__network_is_ostree' from source: set_fact 30529 1726882686.54611: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882686.54615: when evaluation is False, skipping this task 30529 1726882686.54617: _execute() done 30529 1726882686.54620: dumping result to json 30529 1726882686.54622: done dumping result, returning 30529 1726882686.54625: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000002203] 30529 1726882686.54627: sending task result for task 12673a56-9f93-b0f1-edc0-000000002203 30529 1726882686.54713: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002203 30529 1726882686.54716: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882686.54761: no more pending results, returning what we have 30529 1726882686.54765: results queue empty 30529 1726882686.54766: checking for any_errors_fatal 30529 1726882686.54776: done checking for any_errors_fatal 30529 1726882686.54777: checking for max_fail_percentage 30529 1726882686.54780: done checking for max_fail_percentage 30529 1726882686.54781: checking to see if all hosts have failed and the running result is not ok 30529 1726882686.54782: done checking to see if all hosts have failed 30529 1726882686.54783: getting the remaining hosts for this loop 30529 1726882686.54784: done getting the remaining hosts for this loop 
30529 1726882686.54788: getting the next task for host managed_node1 30529 1726882686.54829: done getting next task for host managed_node1 30529 1726882686.54833: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882686.54838: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882686.54856: getting variables 30529 1726882686.54858: in VariableManager get_vars() 30529 1726882686.54899: Calling all_inventory to load vars for managed_node1 30529 1726882686.54915: Calling groups_inventory to load vars for managed_node1 30529 1726882686.54918: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882686.54927: Calling all_plugins_play to load vars for managed_node1 30529 1726882686.54930: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882686.54933: Calling groups_plugins_play to load vars for managed_node1 30529 1726882686.55762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882686.56901: done with get_vars() 30529 1726882686.56917: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:06 -0400 (0:00:00.046) 0:01:40.596 ****** 30529 1726882686.57011: entering _queue_task() for managed_node1/service_facts 30529 1726882686.57338: worker is 1 (out of 1 available) 30529 1726882686.57351: exiting _queue_task() for managed_node1/service_facts 30529 1726882686.57365: done queuing things up, now waiting for results queue to drain 30529 1726882686.57367: waiting for pending results... 
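Unlike the previous two tasks, "Check which services are running" (set_facts.yml:21) actually executes. A minimal sketch — only the task name, module, and file path are taken from the log; any additional options the role sets are not shown here:

```yaml
# Sketch of set_facts.yml:21; the role may set additional options
# (e.g. no_log or a when: guard) not reproduced in this fragment.
- name: Check which services are running
  service_facts:
```

Because this task runs for real, the log below switches from variable bookkeeping to connection setup: the `ssh` connection plugin and `sh` shell are loaded, connection vars are pinned (pipelining False, timeout 10, ZIP_DEFLATED module compression), and `_low_level_execute_command()` first probes the remote home directory with `echo ~`, then creates a per-task temp directory under `~/.ansible/tmp` for the module payload.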
30529 1726882686.57563: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882686.57673: in run() - task 12673a56-9f93-b0f1-edc0-000000002205 30529 1726882686.57685: variable 'ansible_search_path' from source: unknown 30529 1726882686.57700: variable 'ansible_search_path' from source: unknown 30529 1726882686.57732: calling self._execute() 30529 1726882686.57802: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.57806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.57815: variable 'omit' from source: magic vars 30529 1726882686.58090: variable 'ansible_distribution_major_version' from source: facts 30529 1726882686.58100: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882686.58105: variable 'omit' from source: magic vars 30529 1726882686.58163: variable 'omit' from source: magic vars 30529 1726882686.58184: variable 'omit' from source: magic vars 30529 1726882686.58217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882686.58243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882686.58261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882686.58275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882686.58286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882686.58312: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882686.58315: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.58318: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882686.58400: Set connection var ansible_shell_executable to /bin/sh 30529 1726882686.58404: Set connection var ansible_pipelining to False 30529 1726882686.58406: Set connection var ansible_shell_type to sh 30529 1726882686.58418: Set connection var ansible_timeout to 10 30529 1726882686.58420: Set connection var ansible_connection to ssh 30529 1726882686.58432: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882686.58444: variable 'ansible_shell_executable' from source: unknown 30529 1726882686.58447: variable 'ansible_connection' from source: unknown 30529 1726882686.58449: variable 'ansible_module_compression' from source: unknown 30529 1726882686.58452: variable 'ansible_shell_type' from source: unknown 30529 1726882686.58454: variable 'ansible_shell_executable' from source: unknown 30529 1726882686.58456: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882686.58460: variable 'ansible_pipelining' from source: unknown 30529 1726882686.58463: variable 'ansible_timeout' from source: unknown 30529 1726882686.58467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882686.58609: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882686.58618: variable 'omit' from source: magic vars 30529 1726882686.58623: starting attempt loop 30529 1726882686.58626: running the handler 30529 1726882686.58639: _low_level_execute_command(): starting 30529 1726882686.58645: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882686.59189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882686.59195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.59198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882686.59201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.59259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882686.59278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882686.59331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882686.61025: stdout chunk (state=3): >>>/root <<< 30529 1726882686.61125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882686.61163: stderr chunk (state=3): >>><<< 30529 1726882686.61167: stdout chunk (state=3): >>><<< 30529 1726882686.61187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882686.61212: _low_level_execute_command(): starting 30529 1726882686.61216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279 `" && echo ansible-tmp-1726882686.611883-35278-48276373683279="` echo /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279 `" ) && sleep 0' 30529 1726882686.61751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882686.61755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882686.61771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882686.61775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.61827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882686.61872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882686.63726: stdout chunk (state=3): >>>ansible-tmp-1726882686.611883-35278-48276373683279=/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279 <<< 30529 1726882686.63900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882686.63904: stdout chunk (state=3): >>><<< 30529 1726882686.63907: stderr chunk (state=3): >>><<< 30529 1726882686.64003: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882686.611883-35278-48276373683279=/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882686.64008: variable 'ansible_module_compression' from source: unknown 30529 1726882686.64066: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882686.64164: variable 'ansible_facts' from source: unknown 30529 1726882686.64402: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py 30529 1726882686.64579: Sending initial data 30529 1726882686.64583: Sent initial data (160 bytes) 30529 1726882686.65172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882686.65178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882686.65198: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882686.65202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.65297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.65302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882686.65315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882686.65343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882686.65380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882686.66901: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882686.66954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882686.67084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvyazluzr /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py <<< 30529 1726882686.67090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpvyazluzr" to remote "/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py" <<< 30529 1726882686.67877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882686.67921: stderr chunk (state=3): >>><<< 30529 1726882686.67924: stdout chunk (state=3): >>><<< 30529 1726882686.67939: done transferring module to remote 30529 1726882686.67960: _low_level_execute_command(): starting 30529 1726882686.67963: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/ /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py && sleep 0' 30529 1726882686.68462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.68466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882686.68474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.68540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882686.68544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882686.68606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882686.70303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882686.70324: stderr chunk (state=3): >>><<< 30529 1726882686.70329: stdout chunk (state=3): >>><<< 30529 1726882686.70347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882686.70350: _low_level_execute_command(): starting 30529 1726882686.70353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/AnsiballZ_service_facts.py && sleep 0' 30529 1726882686.70973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882686.70990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882686.71103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882686.71309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.21724: stdout chunk (state=3): >>> {"ansible_facts": {"services": 
{"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882688.21754: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30529 1726882688.21776: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882688.23244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882688.23273: stderr chunk (state=3): >>><<< 30529 1726882688.23277: stdout chunk (state=3): >>><<< 30529 1726882688.23304: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882688.23765: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882688.23773: _low_level_execute_command(): starting 30529 1726882688.23779: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882686.611883-35278-48276373683279/ > /dev/null 2>&1 && sleep 0' 30529 1726882688.24251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882688.24254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882688.24256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.24259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882688.24262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.24312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882688.24315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.24328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.24371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.26131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.26155: stderr chunk (state=3): >>><<< 30529 1726882688.26158: stdout chunk (state=3): >>><<< 30529 1726882688.26170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882688.26176: handler run complete 30529 1726882688.26296: variable 'ansible_facts' from source: unknown 30529 1726882688.26389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882688.26680: variable 'ansible_facts' from source: unknown 30529 1726882688.26758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882688.26875: attempt loop complete, returning result 30529 1726882688.26878: _execute() done 30529 1726882688.26881: dumping result to json 30529 1726882688.26921: done dumping result, returning 30529 1726882688.26929: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000002205] 30529 1726882688.26933: sending task result for task 12673a56-9f93-b0f1-edc0-000000002205 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882688.27568: no more pending results, returning what we have 30529 1726882688.27571: results queue empty 30529 1726882688.27571: checking for any_errors_fatal 30529 1726882688.27575: done checking for any_errors_fatal 30529 1726882688.27576: checking for max_fail_percentage 30529 1726882688.27578: done checking for max_fail_percentage 30529 1726882688.27578: checking to see if all hosts have failed and the running result is not ok 30529 1726882688.27579: done checking to see if all hosts have failed 30529 1726882688.27580: getting the remaining hosts for this loop 30529 1726882688.27581: done getting the remaining hosts for this loop 30529 1726882688.27584: getting the next task for host managed_node1 30529 1726882688.27592: done getting next task for host managed_node1 30529 1726882688.27596: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882688.27601: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882688.27611: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002205 30529 1726882688.27616: WORKER PROCESS EXITING 30529 1726882688.27623: getting variables 30529 1726882688.27624: in VariableManager get_vars() 30529 1726882688.27651: Calling all_inventory to load vars for managed_node1 30529 1726882688.27653: Calling groups_inventory to load vars for managed_node1 30529 1726882688.27654: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882688.27660: Calling all_plugins_play to load vars for managed_node1 30529 1726882688.27662: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882688.27668: Calling groups_plugins_play to load vars for managed_node1 30529 1726882688.28498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882688.29367: done with get_vars() 30529 1726882688.29383: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:08 -0400 (0:00:01.724) 0:01:42.320 ****** 30529 1726882688.29454: entering _queue_task() for managed_node1/package_facts 30529 1726882688.29681: worker is 1 (out of 1 available) 30529 1726882688.29696: exiting _queue_task() for managed_node1/package_facts 30529 1726882688.29710: done queuing things up, now waiting for results queue to drain 30529 1726882688.29712: waiting for pending results... 
30529 1726882688.29901: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882688.29999: in run() - task 12673a56-9f93-b0f1-edc0-000000002206 30529 1726882688.30011: variable 'ansible_search_path' from source: unknown 30529 1726882688.30016: variable 'ansible_search_path' from source: unknown 30529 1726882688.30048: calling self._execute() 30529 1726882688.30119: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882688.30123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882688.30131: variable 'omit' from source: magic vars 30529 1726882688.30416: variable 'ansible_distribution_major_version' from source: facts 30529 1726882688.30425: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882688.30431: variable 'omit' from source: magic vars 30529 1726882688.30486: variable 'omit' from source: magic vars 30529 1726882688.30513: variable 'omit' from source: magic vars 30529 1726882688.30544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882688.30571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882688.30588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882688.30608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882688.30618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882688.30643: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882688.30646: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882688.30649: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882688.30726: Set connection var ansible_shell_executable to /bin/sh 30529 1726882688.30729: Set connection var ansible_pipelining to False 30529 1726882688.30732: Set connection var ansible_shell_type to sh 30529 1726882688.30740: Set connection var ansible_timeout to 10 30529 1726882688.30742: Set connection var ansible_connection to ssh 30529 1726882688.30747: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882688.30764: variable 'ansible_shell_executable' from source: unknown 30529 1726882688.30767: variable 'ansible_connection' from source: unknown 30529 1726882688.30770: variable 'ansible_module_compression' from source: unknown 30529 1726882688.30772: variable 'ansible_shell_type' from source: unknown 30529 1726882688.30774: variable 'ansible_shell_executable' from source: unknown 30529 1726882688.30777: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882688.30779: variable 'ansible_pipelining' from source: unknown 30529 1726882688.30783: variable 'ansible_timeout' from source: unknown 30529 1726882688.30787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882688.30938: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882688.30947: variable 'omit' from source: magic vars 30529 1726882688.30952: starting attempt loop 30529 1726882688.30955: running the handler 30529 1726882688.30968: _low_level_execute_command(): starting 30529 1726882688.30974: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882688.31477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882688.31509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.31513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882688.31516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.31567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882688.31570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.31577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.31621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.33172: stdout chunk (state=3): >>>/root <<< 30529 1726882688.33272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.33304: stderr chunk (state=3): >>><<< 30529 1726882688.33307: stdout chunk (state=3): >>><<< 30529 1726882688.33329: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882688.33339: _low_level_execute_command(): starting 30529 1726882688.33349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863 `" && echo ansible-tmp-1726882688.3332784-35342-229660897156863="` echo /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863 `" ) && sleep 0' 30529 1726882688.33790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882688.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.33805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882688.33807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.33849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.33856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.33902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.35731: stdout chunk (state=3): >>>ansible-tmp-1726882688.3332784-35342-229660897156863=/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863 <<< 30529 1726882688.35832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.35854: stderr chunk (state=3): >>><<< 30529 1726882688.35857: stdout chunk (state=3): >>><<< 30529 1726882688.35870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882688.3332784-35342-229660897156863=/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882688.35915: variable 'ansible_module_compression' from source: unknown 30529 1726882688.35950: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882688.35997: variable 'ansible_facts' from source: unknown 30529 1726882688.36115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py 30529 1726882688.36213: Sending initial data 30529 1726882688.36217: Sent initial data (162 bytes) 30529 1726882688.36644: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882688.36647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882688.36649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882688.36653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
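The remote-tmpdir command that just completed above follows a fixed idiom: create a private (umask 77) directory under `~/.ansible/tmp` and echo a `name=path` pair so the executor can parse the resulting path out of stdout. As a sketch with an illustrative directory name (the timestamped names in the log are generated per task and are not reproduced here):

```shell
# Sketch of the executor's tmpdir idiom seen in the log:
# restrict permissions, make the parent, make the per-task dir,
# then echo name=path so the caller can read the path from stdout.
# "ansible-tmp-example" stands in for the generated timestamped name.
( umask 77 && mkdir -p "$HOME/.ansible/tmp" \
  && mkdir "$HOME/.ansible/tmp/ansible-tmp-example" \
  && echo ansible-tmp-example="$HOME/.ansible/tmp/ansible-tmp-example" )
```

The trailing `&& sleep 0` in the logged command is part of how the connection plugin flushes the command; it is omitted here since it does not affect the directory creation.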
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882688.36656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.36699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882688.36718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.36720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.36758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.38262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882688.38321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882688.38367: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp6n927jzv /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py <<< 30529 1726882688.38370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py" <<< 30529 1726882688.38420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp6n927jzv" to remote "/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py" <<< 30529 1726882688.39935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.39938: stdout chunk (state=3): >>><<< 30529 1726882688.39940: stderr chunk (state=3): >>><<< 30529 1726882688.39949: done transferring module to remote 30529 1726882688.39963: _low_level_execute_command(): starting 30529 1726882688.39972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/ /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py && sleep 0' 30529 1726882688.40566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882688.40584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882688.40703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882688.40725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882688.40745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.40765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.40847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.42597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.42615: stdout chunk (state=3): >>><<< 30529 1726882688.42626: stderr chunk (state=3): >>><<< 30529 1726882688.42643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882688.42650: _low_level_execute_command(): starting 30529 1726882688.42658: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/AnsiballZ_package_facts.py && sleep 0' 30529 1726882688.43307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882688.43349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.43363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882688.43449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882688.43464: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882688.43478: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.43499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.43585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.87268: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30529 1726882688.87445: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": 
"2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": 
"perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882688.89164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.89196: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882688.89200: stdout chunk (state=3): >>><<< 30529 1726882688.89202: stderr chunk (state=3): >>><<< 30529 1726882688.89406: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882688.92222: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882688.92250: _low_level_execute_command(): starting 30529 1726882688.92261: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882688.3332784-35342-229660897156863/ > /dev/null 2>&1 && sleep 0' 30529 1726882688.92825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882688.92837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882688.92851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882688.92866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882688.92880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882688.92891: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882688.92907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882688.92923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882688.93010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882688.93035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882688.93111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882688.94978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882688.94987: stdout chunk (state=3): >>><<< 30529 1726882688.94999: stderr chunk (state=3): >>><<< 30529 1726882688.95015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882688.95024: handler run complete 30529 1726882688.95802: variable 'ansible_facts' from source: unknown 30529 1726882688.96246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882688.97992: variable 'ansible_facts' from source: unknown 30529 1726882688.98270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882688.98651: attempt loop complete, returning result 30529 1726882688.98659: _execute() done 30529 1726882688.98662: dumping result to json 30529 1726882688.98776: done dumping result, returning 30529 1726882688.98785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000002206] 30529 1726882688.98791: sending task result for task 12673a56-9f93-b0f1-edc0-000000002206 30529 1726882689.06550: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002206 30529 1726882689.06554: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882689.06732: no more pending results, returning what we have 30529 1726882689.06735: results queue empty 30529 1726882689.06736: checking for any_errors_fatal 30529 1726882689.06744: done checking for any_errors_fatal 30529 1726882689.06745: checking for max_fail_percentage 30529 1726882689.06747: done checking for max_fail_percentage 30529 1726882689.06747: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.06748: done checking to see if all hosts have failed 30529 1726882689.06749: getting the remaining hosts for this loop 30529 
1726882689.06750: done getting the remaining hosts for this loop 30529 1726882689.06754: getting the next task for host managed_node1 30529 1726882689.06762: done getting next task for host managed_node1 30529 1726882689.06765: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882689.06771: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882689.06798: getting variables 30529 1726882689.06800: in VariableManager get_vars() 30529 1726882689.06835: Calling all_inventory to load vars for managed_node1 30529 1726882689.06838: Calling groups_inventory to load vars for managed_node1 30529 1726882689.06841: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.06849: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.06853: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.06856: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.08202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.10071: done with get_vars() 30529 1726882689.10101: done getting variables 30529 1726882689.10172: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:09 -0400 (0:00:00.807) 0:01:43.128 ****** 30529 1726882689.10215: entering _queue_task() for managed_node1/debug 30529 1726882689.10725: worker is 1 (out of 1 available) 30529 1726882689.10738: exiting _queue_task() for managed_node1/debug 30529 1726882689.10750: done queuing things up, now waiting for results queue to drain 30529 1726882689.10752: waiting for pending results... 
30529 1726882689.11114: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882689.11186: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a4 30529 1726882689.11215: variable 'ansible_search_path' from source: unknown 30529 1726882689.11223: variable 'ansible_search_path' from source: unknown 30529 1726882689.11286: calling self._execute() 30529 1726882689.11410: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.11472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.11476: variable 'omit' from source: magic vars 30529 1726882689.11891: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.11923: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.11936: variable 'omit' from source: magic vars 30529 1726882689.12007: variable 'omit' from source: magic vars 30529 1726882689.12143: variable 'network_provider' from source: set_fact 30529 1726882689.12198: variable 'omit' from source: magic vars 30529 1726882689.12218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882689.12270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882689.12302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882689.12327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882689.12359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882689.12456: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882689.12460: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882689.12462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.12564: Set connection var ansible_shell_executable to /bin/sh 30529 1726882689.12568: Set connection var ansible_pipelining to False 30529 1726882689.12570: Set connection var ansible_shell_type to sh 30529 1726882689.12595: Set connection var ansible_timeout to 10 30529 1726882689.12671: Set connection var ansible_connection to ssh 30529 1726882689.12674: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882689.12677: variable 'ansible_shell_executable' from source: unknown 30529 1726882689.12679: variable 'ansible_connection' from source: unknown 30529 1726882689.12683: variable 'ansible_module_compression' from source: unknown 30529 1726882689.12685: variable 'ansible_shell_type' from source: unknown 30529 1726882689.12690: variable 'ansible_shell_executable' from source: unknown 30529 1726882689.12694: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.12696: variable 'ansible_pipelining' from source: unknown 30529 1726882689.12698: variable 'ansible_timeout' from source: unknown 30529 1726882689.12700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.12899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882689.12903: variable 'omit' from source: magic vars 30529 1726882689.12905: starting attempt loop 30529 1726882689.12908: running the handler 30529 1726882689.12952: handler run complete 30529 1726882689.12972: attempt loop complete, returning result 30529 1726882689.13034: _execute() done 30529 1726882689.13039: dumping result to json 30529 1726882689.13041: done dumping result, returning 
30529 1726882689.13044: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-0000000021a4] 30529 1726882689.13046: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a4 30529 1726882689.13120: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a4 ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882689.13278: no more pending results, returning what we have 30529 1726882689.13282: results queue empty 30529 1726882689.13283: checking for any_errors_fatal 30529 1726882689.13303: done checking for any_errors_fatal 30529 1726882689.13304: checking for max_fail_percentage 30529 1726882689.13306: done checking for max_fail_percentage 30529 1726882689.13307: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.13309: done checking to see if all hosts have failed 30529 1726882689.13310: getting the remaining hosts for this loop 30529 1726882689.13312: done getting the remaining hosts for this loop 30529 1726882689.13316: getting the next task for host managed_node1 30529 1726882689.13327: done getting next task for host managed_node1 30529 1726882689.13330: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882689.13335: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.13350: getting variables 30529 1726882689.13352: in VariableManager get_vars() 30529 1726882689.13611: Calling all_inventory to load vars for managed_node1 30529 1726882689.13614: Calling groups_inventory to load vars for managed_node1 30529 1726882689.13617: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.13624: WORKER PROCESS EXITING 30529 1726882689.13633: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.13636: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.13639: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.23808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.28181: done with get_vars() 30529 1726882689.28325: done getting variables 30529 1726882689.28491: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:09 -0400 (0:00:00.183) 0:01:43.311 ****** 30529 1726882689.28531: entering _queue_task() for managed_node1/fail 30529 1726882689.29365: worker is 1 (out of 1 available) 30529 1726882689.29378: exiting _queue_task() for managed_node1/fail 30529 1726882689.29395: done queuing things up, now waiting for results queue to drain 30529 1726882689.29398: waiting for pending results... 30529 1726882689.29938: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882689.30502: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a5 30529 1726882689.30506: variable 'ansible_search_path' from source: unknown 30529 1726882689.30509: variable 'ansible_search_path' from source: unknown 30529 1726882689.30518: calling self._execute() 30529 1726882689.30738: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.30757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.30774: variable 'omit' from source: magic vars 30529 1726882689.31802: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.31807: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.32236: variable 'network_state' from source: role '' defaults 30529 1726882689.32240: Evaluated conditional (network_state != {}): False 30529 1726882689.32243: when evaluation is False, skipping this task 30529 1726882689.32246: _execute() done 30529 1726882689.32248: dumping result to json 30529 1726882689.32250: done dumping result, returning 30529 1726882689.32254: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-0000000021a5] 30529 1726882689.32257: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a5 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882689.32623: no more pending results, returning what we have 30529 1726882689.32627: results queue empty 30529 1726882689.32628: checking for any_errors_fatal 30529 1726882689.32636: done checking for any_errors_fatal 30529 1726882689.32637: checking for max_fail_percentage 30529 1726882689.32639: done checking for max_fail_percentage 30529 1726882689.32640: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.32641: done checking to see if all hosts have failed 30529 1726882689.32641: getting the remaining hosts for this loop 30529 1726882689.32643: done getting the remaining hosts for this loop 30529 1726882689.32647: getting the next task for host managed_node1 30529 1726882689.32657: done getting next task for host managed_node1 30529 1726882689.32661: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882689.32667: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.32698: getting variables 30529 1726882689.32703: in VariableManager get_vars() 30529 1726882689.32755: Calling all_inventory to load vars for managed_node1 30529 1726882689.32759: Calling groups_inventory to load vars for managed_node1 30529 1726882689.32761: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.32775: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.32778: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.32781: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.33512: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a5 30529 1726882689.33516: WORKER PROCESS EXITING 30529 1726882689.36239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.39696: done with get_vars() 30529 1726882689.39722: done getting variables 30529 1726882689.39805: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:09 -0400 (0:00:00.113) 0:01:43.424 ****** 30529 1726882689.39843: entering _queue_task() for managed_node1/fail 30529 1726882689.40338: worker is 1 (out of 1 available) 30529 1726882689.40349: exiting _queue_task() for managed_node1/fail 30529 1726882689.40360: done queuing things up, now waiting for results queue to drain 30529 1726882689.40361: waiting for pending results... 30529 1726882689.40572: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882689.40749: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a6 30529 1726882689.40768: variable 'ansible_search_path' from source: unknown 30529 1726882689.40776: variable 'ansible_search_path' from source: unknown 30529 1726882689.40824: calling self._execute() 30529 1726882689.40944: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.40961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.40975: variable 'omit' from source: magic vars 30529 1726882689.41455: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.41459: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.41586: variable 'network_state' from source: role '' defaults 30529 1726882689.41612: Evaluated conditional (network_state != {}): False 30529 1726882689.41681: when evaluation is False, skipping this task 30529 1726882689.41684: _execute() done 30529 1726882689.41686: dumping result to json 30529 1726882689.41692: done dumping result, returning 30529 1726882689.41696: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-0000000021a6] 30529 1726882689.41699: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a6 30529 1726882689.41779: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a6 30529 1726882689.41782: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882689.41838: no more pending results, returning what we have 30529 1726882689.41842: results queue empty 30529 1726882689.41843: checking for any_errors_fatal 30529 1726882689.41851: done checking for any_errors_fatal 30529 1726882689.41852: checking for max_fail_percentage 30529 1726882689.41854: done checking for max_fail_percentage 30529 1726882689.41855: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.41856: done checking to see if all hosts have failed 30529 1726882689.41857: getting the remaining hosts for this loop 30529 1726882689.41858: done getting the remaining hosts for this loop 30529 1726882689.41862: getting the next task for host managed_node1 30529 1726882689.41872: done getting next task for host managed_node1 30529 1726882689.41876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882689.41882: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.41912: getting variables 30529 1726882689.41914: in VariableManager get_vars() 30529 1726882689.42077: Calling all_inventory to load vars for managed_node1 30529 1726882689.42080: Calling groups_inventory to load vars for managed_node1 30529 1726882689.42082: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.42157: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.42162: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.42166: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.43741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.45411: done with get_vars() 30529 1726882689.45436: done getting variables 30529 1726882689.45507: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:38:09 -0400 (0:00:00.056) 0:01:43.481 ****** 30529 1726882689.45545: entering _queue_task() for managed_node1/fail 30529 1726882689.45914: worker is 1 (out of 1 available) 30529 1726882689.46006: exiting _queue_task() for managed_node1/fail 30529 1726882689.46018: done queuing things up, now waiting for results queue to drain 30529 1726882689.46020: waiting for pending results... 30529 1726882689.46314: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882689.46408: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a7 30529 1726882689.46433: variable 'ansible_search_path' from source: unknown 30529 1726882689.46440: variable 'ansible_search_path' from source: unknown 30529 1726882689.46477: calling self._execute() 30529 1726882689.46628: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.46636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.46639: variable 'omit' from source: magic vars 30529 1726882689.47042: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.47065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.47254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882689.49896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882689.50324: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882689.50378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882689.50433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 
1726882689.50499: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882689.50554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.50590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.50626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.50698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.50701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.50798: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.50821: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882689.50945: variable 'ansible_distribution' from source: facts 30529 1726882689.50978: variable '__network_rh_distros' from source: role '' defaults 30529 1726882689.50981: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882689.51227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.51261: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.51305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.51377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.51380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.51431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.51456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.51484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.51597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.51600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882689.51602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.51618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.51651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.51699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.51719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.52035: variable 'network_connections' from source: include params 30529 1726882689.52051: variable 'interface' from source: play vars 30529 1726882689.52129: variable 'interface' from source: play vars 30529 1726882689.52172: variable 'network_state' from source: role '' defaults 30529 1726882689.52224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882689.52397: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882689.52445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882689.52498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882689.52546: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882689.52798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882689.52801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882689.52811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.52813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882689.52816: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882689.52818: when evaluation is False, skipping this task 30529 1726882689.52820: _execute() done 30529 1726882689.52822: dumping result to json 30529 1726882689.52824: done dumping result, returning 30529 1726882689.52827: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-0000000021a7] 30529 1726882689.52829: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a7 30529 1726882689.52901: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a7 30529 1726882689.52905: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882689.52955: no more pending results, returning what we have 30529 1726882689.52959: results queue empty 30529 1726882689.52960: checking for any_errors_fatal 30529 1726882689.52966: done checking for any_errors_fatal 30529 1726882689.52967: checking for max_fail_percentage 30529 1726882689.52969: done checking for max_fail_percentage 30529 1726882689.52970: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.52971: done checking to see if all hosts have failed 30529 1726882689.52971: getting the remaining hosts for this loop 30529 1726882689.52973: done getting the remaining hosts for this loop 30529 1726882689.52977: getting the next task for host managed_node1 30529 1726882689.52986: done getting next task for host managed_node1 30529 1726882689.52995: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882689.53000: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.53021: getting variables 30529 1726882689.53023: in VariableManager get_vars() 30529 1726882689.53067: Calling all_inventory to load vars for managed_node1 30529 1726882689.53069: Calling groups_inventory to load vars for managed_node1 30529 1726882689.53071: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.53081: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.53084: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.53086: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.55041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.56727: done with get_vars() 30529 1726882689.56771: done getting variables 30529 1726882689.56837: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:09 -0400 (0:00:00.113) 0:01:43.594 ****** 30529 1726882689.56873: entering _queue_task() for managed_node1/dnf 30529 1726882689.57353: worker is 1 (out of 1 available) 30529 1726882689.57369: exiting _queue_task() for managed_node1/dnf 30529 1726882689.57383: done queuing things up, now waiting for results queue to drain 30529 1726882689.57385: waiting for pending results... 30529 1726882689.57711: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882689.57876: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a8 30529 1726882689.57880: variable 'ansible_search_path' from source: unknown 30529 1726882689.57882: variable 'ansible_search_path' from source: unknown 30529 1726882689.57885: calling self._execute() 30529 1726882689.57982: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.57998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.58011: variable 'omit' from source: magic vars 30529 1726882689.58391: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.58418: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.58635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882689.60999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882689.61134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882689.61137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 
1726882689.61175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882689.61210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882689.61291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.61325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.61356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.61502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.61505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.61635: variable 'ansible_distribution' from source: facts 30529 1726882689.61638: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.61656: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882689.61798: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882689.61968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882689.62044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.62048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.62085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.62102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.62153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.62173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.62197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.62243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.62262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.62299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.62322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.62371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.62395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.62479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.62595: variable 'network_connections' from source: include params 30529 1726882689.62608: variable 'interface' from source: play vars 30529 1726882689.62999: variable 'interface' from source: play vars 30529 1726882689.63068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882689.63241: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882689.63282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882689.63315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882689.63349: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882689.63385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882689.63408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882689.63458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.63461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882689.63511: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882689.63745: variable 'network_connections' from source: include params 30529 1726882689.63748: variable 'interface' from source: play vars 30529 1726882689.63898: variable 'interface' from source: play vars 30529 1726882689.63902: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882689.63904: when evaluation is False, skipping this task 30529 1726882689.63906: _execute() done 30529 1726882689.63908: dumping result to json 30529 1726882689.63910: done dumping result, returning 30529 1726882689.63912: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000021a8] 30529 1726882689.63913: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a8 30529 1726882689.63977: 
done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a8 30529 1726882689.63979: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882689.64033: no more pending results, returning what we have 30529 1726882689.64037: results queue empty 30529 1726882689.64038: checking for any_errors_fatal 30529 1726882689.64047: done checking for any_errors_fatal 30529 1726882689.64047: checking for max_fail_percentage 30529 1726882689.64049: done checking for max_fail_percentage 30529 1726882689.64050: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.64051: done checking to see if all hosts have failed 30529 1726882689.64051: getting the remaining hosts for this loop 30529 1726882689.64053: done getting the remaining hosts for this loop 30529 1726882689.64057: getting the next task for host managed_node1 30529 1726882689.64064: done getting next task for host managed_node1 30529 1726882689.64068: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882689.64073: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.64098: getting variables 30529 1726882689.64100: in VariableManager get_vars() 30529 1726882689.64143: Calling all_inventory to load vars for managed_node1 30529 1726882689.64145: Calling groups_inventory to load vars for managed_node1 30529 1726882689.64147: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.64156: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.64159: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.64161: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.65619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.67174: done with get_vars() 30529 1726882689.67202: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882689.67282: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:09 -0400 (0:00:00.104) 0:01:43.699 ****** 30529 1726882689.67341: entering _queue_task() for managed_node1/yum 30529 1726882689.67725: worker is 1 (out of 1 available) 30529 1726882689.67740: exiting _queue_task() for managed_node1/yum 30529 1726882689.67755: done queuing things up, now waiting for results queue to drain 30529 1726882689.67757: waiting for pending results... 30529 1726882689.68219: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882689.68284: in run() - task 12673a56-9f93-b0f1-edc0-0000000021a9 30529 1726882689.68320: variable 'ansible_search_path' from source: unknown 30529 1726882689.68324: variable 'ansible_search_path' from source: unknown 30529 1726882689.68339: calling self._execute() 30529 1726882689.68899: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.68903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.68906: variable 'omit' from source: magic vars 30529 1726882689.69198: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.69219: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.69504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882689.72055: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882689.72122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882689.72230: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882689.72248: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882689.72390: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882689.72396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.72438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.72471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.72529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.72548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.72654: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.72674: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882689.72681: when evaluation is False, skipping this task 30529 1726882689.72733: _execute() done 30529 1726882689.72736: dumping result to json 30529 1726882689.72739: done dumping result, returning 30529 1726882689.72741: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000021a9] 30529 1726882689.72744: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a9 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882689.72885: no more pending results, returning what we have 30529 1726882689.72889: results queue empty 30529 1726882689.72890: checking for any_errors_fatal 30529 1726882689.72900: done checking for any_errors_fatal 30529 1726882689.72901: checking for max_fail_percentage 30529 1726882689.72903: done checking for max_fail_percentage 30529 1726882689.72904: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.72905: done checking to see if all hosts have failed 30529 1726882689.72906: getting the remaining hosts for this loop 30529 1726882689.72908: done getting the remaining hosts for this loop 30529 1726882689.72912: getting the next task for host managed_node1 30529 1726882689.72922: done getting next task for host managed_node1 30529 1726882689.72926: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882689.72932: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.72957: getting variables 30529 1726882689.72959: in VariableManager get_vars() 30529 1726882689.73123: Calling all_inventory to load vars for managed_node1 30529 1726882689.73126: Calling groups_inventory to load vars for managed_node1 30529 1726882689.73129: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.73140: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.73144: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.73147: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.73671: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021a9 30529 1726882689.73675: WORKER PROCESS EXITING 30529 1726882689.74256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.75123: done with get_vars() 30529 1726882689.75138: done getting variables 30529 1726882689.75181: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:09 -0400 (0:00:00.078) 0:01:43.778 ****** 30529 1726882689.75211: entering _queue_task() for managed_node1/fail 30529 1726882689.75568: worker is 1 (out of 1 available) 30529 1726882689.75582: exiting _queue_task() for managed_node1/fail 30529 1726882689.75597: done queuing things up, now waiting for results queue to drain 30529 1726882689.75599: waiting for pending results... 30529 1726882689.75951: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882689.76026: in run() - task 12673a56-9f93-b0f1-edc0-0000000021aa 30529 1726882689.76043: variable 'ansible_search_path' from source: unknown 30529 1726882689.76051: variable 'ansible_search_path' from source: unknown 30529 1726882689.76179: calling self._execute() 30529 1726882689.76183: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.76186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.76197: variable 'omit' from source: magic vars 30529 1726882689.76583: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.76634: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.76741: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882689.76947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882689.79269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882689.79347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882689.79466: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882689.79469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882689.79472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882689.79539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.79573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.79602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.79642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.79656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.79711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.79734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.79759: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.79805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.79821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.79859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.79900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.79914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.79998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.80003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.80198: variable 'network_connections' from source: include params 30529 1726882689.80201: variable 'interface' from source: play vars 30529 1726882689.80235: variable 'interface' from source: play vars 30529 1726882689.80308: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882689.80475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882689.80526: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882689.80827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882689.80830: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882689.80832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882689.80835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882689.80837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.80839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882689.80841: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882689.81399: variable 'network_connections' from source: include params 30529 1726882689.81402: variable 'interface' from source: play vars 30529 1726882689.81517: variable 'interface' from source: play vars 30529 1726882689.81552: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882689.81556: when evaluation is False, skipping this task 30529 
1726882689.81559: _execute() done 30529 1726882689.81561: dumping result to json 30529 1726882689.81563: done dumping result, returning 30529 1726882689.81599: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000021aa] 30529 1726882689.81602: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021aa 30529 1726882689.81962: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021aa 30529 1726882689.81965: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882689.82025: no more pending results, returning what we have 30529 1726882689.82029: results queue empty 30529 1726882689.82030: checking for any_errors_fatal 30529 1726882689.82037: done checking for any_errors_fatal 30529 1726882689.82038: checking for max_fail_percentage 30529 1726882689.82040: done checking for max_fail_percentage 30529 1726882689.82042: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.82043: done checking to see if all hosts have failed 30529 1726882689.82043: getting the remaining hosts for this loop 30529 1726882689.82045: done getting the remaining hosts for this loop 30529 1726882689.82049: getting the next task for host managed_node1 30529 1726882689.82058: done getting next task for host managed_node1 30529 1726882689.82062: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882689.82067: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882689.82089: getting variables 30529 1726882689.82092: in VariableManager get_vars() 30529 1726882689.82340: Calling all_inventory to load vars for managed_node1 30529 1726882689.82343: Calling groups_inventory to load vars for managed_node1 30529 1726882689.82345: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.82360: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.82363: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.82367: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.83752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882689.85487: done with get_vars() 30529 1726882689.85511: done getting variables 30529 1726882689.85574: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:09 -0400 (0:00:00.103) 0:01:43.882 ****** 30529 1726882689.85611: entering _queue_task() for managed_node1/package 30529 1726882689.86100: worker is 1 (out of 1 available) 30529 1726882689.86112: exiting _queue_task() for managed_node1/package 30529 1726882689.86123: done queuing things up, now waiting for results queue to drain 30529 1726882689.86125: waiting for pending results... 30529 1726882689.86362: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882689.86479: in run() - task 12673a56-9f93-b0f1-edc0-0000000021ab 30529 1726882689.86518: variable 'ansible_search_path' from source: unknown 30529 1726882689.86522: variable 'ansible_search_path' from source: unknown 30529 1726882689.86553: calling self._execute() 30529 1726882689.86675: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882689.86679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882689.86685: variable 'omit' from source: magic vars 30529 1726882689.87170: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.87174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882689.87303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882689.87585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882689.87645: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882689.87684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882689.87770: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882689.87894: variable 'network_packages' from source: role '' defaults 30529 1726882689.88009: variable '__network_provider_setup' from source: role '' defaults 30529 1726882689.88026: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882689.88098: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882689.88112: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882689.88261: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882689.88382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882689.90446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882689.90511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882689.90561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882689.90601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882689.90632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882689.90733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.90775: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.90810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.90856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.90884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.90934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.90978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.91000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.91042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.91088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882689.91312: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882689.91431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.91460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.91519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.91536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.91551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.91643: variable 'ansible_python' from source: facts 30529 1726882689.91662: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882689.91745: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882689.91845: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882689.91953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.91981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.92008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.92061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.92067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.92112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882689.92171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882689.92178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.92281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882689.92284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882689.92391: variable 'network_connections' from source: include params 
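Each skipped task in this log reports the same JSON shape: `changed`, `false_condition`, and `skip_reason`. As a rough illustration only (a hypothetical helper, not Ansible's actual internals), that result dict could be assembled like this:

```python
def skipped_result(condition: str) -> dict:
    """Build a result dict matching the shape of the 'skipping:' lines
    in this log. Illustrative sketch only; not Ansible's real code."""
    return {
        "changed": False,
        "false_condition": condition,
        "skip_reason": "Conditional result was False",
    }

result = skipped_result(
    "__network_wireless_connections_defined or __network_team_connections_defined"
)
print(result["skip_reason"])
```

This mirrors the `skipping: [managed_node1] => {...}` payloads emitted for each task whose `when` clause evaluated to False.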
30529 1726882689.92406: variable 'interface' from source: play vars 30529 1726882689.92503: variable 'interface' from source: play vars 30529 1726882689.92573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882689.92613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882689.92648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882689.92684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882689.92744: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882689.93052: variable 'network_connections' from source: include params 30529 1726882689.93199: variable 'interface' from source: play vars 30529 1726882689.93203: variable 'interface' from source: play vars 30529 1726882689.93227: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882689.93310: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882689.93648: variable 'network_connections' from source: include params 30529 1726882689.93659: variable 'interface' from source: play vars 30529 1726882689.93725: variable 'interface' from source: play vars 30529 1726882689.93758: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882689.93838: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882689.94167: variable 'network_connections' 
from source: include params 30529 1726882689.94177: variable 'interface' from source: play vars 30529 1726882689.94303: variable 'interface' from source: play vars 30529 1726882689.94320: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882689.94381: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882689.94397: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882689.94462: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882689.94688: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882689.95183: variable 'network_connections' from source: include params 30529 1726882689.95196: variable 'interface' from source: play vars 30529 1726882689.95258: variable 'interface' from source: play vars 30529 1726882689.95284: variable 'ansible_distribution' from source: facts 30529 1726882689.95287: variable '__network_rh_distros' from source: role '' defaults 30529 1726882689.95363: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.95366: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882689.95541: variable 'ansible_distribution' from source: facts 30529 1726882689.95551: variable '__network_rh_distros' from source: role '' defaults 30529 1726882689.95562: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.95575: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882689.95757: variable 'ansible_distribution' from source: facts 30529 1726882689.95768: variable '__network_rh_distros' from source: role '' defaults 30529 1726882689.95777: variable 'ansible_distribution_major_version' from source: facts 30529 1726882689.95817: variable 'network_provider' from source: set_fact 30529 
1726882689.96102: variable 'ansible_facts' from source: unknown 30529 1726882689.97399: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882689.97402: when evaluation is False, skipping this task 30529 1726882689.97405: _execute() done 30529 1726882689.97407: dumping result to json 30529 1726882689.97409: done dumping result, returning 30529 1726882689.97412: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-0000000021ab] 30529 1726882689.97414: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ab 30529 1726882689.97504: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ab 30529 1726882689.97507: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882689.97563: no more pending results, returning what we have 30529 1726882689.97567: results queue empty 30529 1726882689.97568: checking for any_errors_fatal 30529 1726882689.97579: done checking for any_errors_fatal 30529 1726882689.97580: checking for max_fail_percentage 30529 1726882689.97582: done checking for max_fail_percentage 30529 1726882689.97583: checking to see if all hosts have failed and the running result is not ok 30529 1726882689.97584: done checking to see if all hosts have failed 30529 1726882689.97585: getting the remaining hosts for this loop 30529 1726882689.97587: done getting the remaining hosts for this loop 30529 1726882689.97591: getting the next task for host managed_node1 30529 1726882689.97602: done getting next task for host managed_node1 30529 1726882689.97607: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882689.97612: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882689.97637: getting variables 30529 1726882689.97639: in VariableManager get_vars() 30529 1726882689.97691: Calling all_inventory to load vars for managed_node1 30529 1726882689.97800: Calling groups_inventory to load vars for managed_node1 30529 1726882689.97809: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882689.97819: Calling all_plugins_play to load vars for managed_node1 30529 1726882689.97823: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882689.97826: Calling groups_plugins_play to load vars for managed_node1 30529 1726882689.99935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.01264: done with get_vars() 30529 1726882690.01282: done getting variables 30529 1726882690.01342: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:10 -0400 (0:00:00.157) 0:01:44.039 ****** 30529 1726882690.01380: entering _queue_task() for managed_node1/package 30529 1726882690.01730: worker is 1 (out of 1 available) 30529 1726882690.01743: exiting _queue_task() for managed_node1/package 30529 1726882690.01755: done queuing things up, now waiting for results queue to drain 30529 1726882690.01756: waiting for pending results... 
30529 1726882690.02144: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882690.02216: in run() - task 12673a56-9f93-b0f1-edc0-0000000021ac 30529 1726882690.02247: variable 'ansible_search_path' from source: unknown 30529 1726882690.02250: variable 'ansible_search_path' from source: unknown 30529 1726882690.02283: calling self._execute() 30529 1726882690.02375: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.02379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.02388: variable 'omit' from source: magic vars 30529 1726882690.02718: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.02811: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.02998: variable 'network_state' from source: role '' defaults 30529 1726882690.03002: Evaluated conditional (network_state != {}): False 30529 1726882690.03004: when evaluation is False, skipping this task 30529 1726882690.03005: _execute() done 30529 1726882690.03007: dumping result to json 30529 1726882690.03009: done dumping result, returning 30529 1726882690.03011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000021ac] 30529 1726882690.03013: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ac 30529 1726882690.03077: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ac 30529 1726882690.03080: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882690.03124: no more pending results, returning what we have 30529 1726882690.03127: results queue empty 30529 1726882690.03128: checking 
for any_errors_fatal 30529 1726882690.03132: done checking for any_errors_fatal 30529 1726882690.03133: checking for max_fail_percentage 30529 1726882690.03134: done checking for max_fail_percentage 30529 1726882690.03135: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.03136: done checking to see if all hosts have failed 30529 1726882690.03137: getting the remaining hosts for this loop 30529 1726882690.03138: done getting the remaining hosts for this loop 30529 1726882690.03141: getting the next task for host managed_node1 30529 1726882690.03149: done getting next task for host managed_node1 30529 1726882690.03152: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882690.03157: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882690.03174: getting variables 30529 1726882690.03175: in VariableManager get_vars() 30529 1726882690.03212: Calling all_inventory to load vars for managed_node1 30529 1726882690.03215: Calling groups_inventory to load vars for managed_node1 30529 1726882690.03218: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.03226: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.03229: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.03232: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.04442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.05940: done with get_vars() 30529 1726882690.05963: done getting variables 30529 1726882690.06024: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:10 -0400 (0:00:00.046) 0:01:44.086 ****** 30529 1726882690.06060: entering _queue_task() for managed_node1/package 30529 1726882690.06364: worker is 1 (out of 1 available) 30529 1726882690.06376: exiting _queue_task() for managed_node1/package 30529 1726882690.06388: done queuing things up, now waiting for results queue to drain 30529 1726882690.06390: waiting for pending results... 
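The two package-install skips above trace to the same guard: `network_state` falls back to the role default of an empty dict (the log reports it "from source: role '' defaults"), so the task's `when: network_state != {}` condition evaluates False. A minimal plain-Python sketch of that evaluation (not Ansible internals; the empty-dict value is inferred from the log):

```python
# network_state comes from the role defaults as an empty dict, so the
# `when: network_state != {}` guard is False and the task is skipped,
# matching: "skip_reason": "Conditional result was False".
network_state = {}                  # role default, as reported in the log
should_run = network_state != {}
print(should_run)                   # False -> task skipped
```

Supplying any non-empty `network_state` (e.g. an nmstate-style mapping) would flip the guard to True and let the install tasks run.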
30529 1726882690.06811: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882690.06834: in run() - task 12673a56-9f93-b0f1-edc0-0000000021ad 30529 1726882690.06854: variable 'ansible_search_path' from source: unknown 30529 1726882690.06862: variable 'ansible_search_path' from source: unknown 30529 1726882690.06907: calling self._execute() 30529 1726882690.07006: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.07022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.07038: variable 'omit' from source: magic vars 30529 1726882690.07415: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.07431: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.07559: variable 'network_state' from source: role '' defaults 30529 1726882690.07575: Evaluated conditional (network_state != {}): False 30529 1726882690.07668: when evaluation is False, skipping this task 30529 1726882690.07671: _execute() done 30529 1726882690.07673: dumping result to json 30529 1726882690.07676: done dumping result, returning 30529 1726882690.07678: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000021ad] 30529 1726882690.07681: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ad 30529 1726882690.07751: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ad 30529 1726882690.07754: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882690.07819: no more pending results, returning what we have 30529 1726882690.07823: results queue empty 30529 1726882690.07825: checking for 
any_errors_fatal 30529 1726882690.07834: done checking for any_errors_fatal 30529 1726882690.07834: checking for max_fail_percentage 30529 1726882690.07837: done checking for max_fail_percentage 30529 1726882690.07838: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.07839: done checking to see if all hosts have failed 30529 1726882690.07840: getting the remaining hosts for this loop 30529 1726882690.07841: done getting the remaining hosts for this loop 30529 1726882690.07846: getting the next task for host managed_node1 30529 1726882690.07855: done getting next task for host managed_node1 30529 1726882690.07860: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882690.07866: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882690.07890: getting variables 30529 1726882690.07894: in VariableManager get_vars() 30529 1726882690.07940: Calling all_inventory to load vars for managed_node1 30529 1726882690.07943: Calling groups_inventory to load vars for managed_node1 30529 1726882690.07946: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.07959: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.07962: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.07965: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.09498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.11037: done with get_vars() 30529 1726882690.11058: done getting variables 30529 1726882690.11117: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:10 -0400 (0:00:00.050) 0:01:44.137 ****** 30529 1726882690.11154: entering _queue_task() for managed_node1/service 30529 1726882690.11445: worker is 1 (out of 1 available) 30529 1726882690.11458: exiting _queue_task() for managed_node1/service 30529 1726882690.11471: done queuing things up, now waiting for results queue to drain 30529 1726882690.11472: waiting for pending results... 
30529 1726882690.11820: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882690.11919: in run() - task 12673a56-9f93-b0f1-edc0-0000000021ae 30529 1726882690.11940: variable 'ansible_search_path' from source: unknown 30529 1726882690.11949: variable 'ansible_search_path' from source: unknown 30529 1726882690.11988: calling self._execute() 30529 1726882690.12086: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.12134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.12138: variable 'omit' from source: magic vars 30529 1726882690.12487: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.12507: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.12634: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882690.12999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882690.15357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882690.15429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882690.15482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882690.15527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882690.15559: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882690.15650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882690.15685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.15722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.15768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.15788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.15848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.15876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.15906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.15954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.15972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.16016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.16047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.16075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.16117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.16138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.16323: variable 'network_connections' from source: include params 30529 1726882690.16367: variable 'interface' from source: play vars 30529 1726882690.16421: variable 'interface' from source: play vars 30529 1726882690.16504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882690.16681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882690.16731: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882690.16800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882690.16811: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882690.16854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882690.16880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882690.16915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.16944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882690.17016: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882690.17298: variable 'network_connections' from source: include params 30529 1726882690.17301: variable 'interface' from source: play vars 30529 1726882690.17328: variable 'interface' from source: play vars 30529 1726882690.17368: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882690.17376: when evaluation is False, skipping this task 30529 1726882690.17382: _execute() done 30529 1726882690.17389: dumping result to json 30529 1726882690.17399: done dumping result, returning 30529 1726882690.17411: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000021ae] 30529 1726882690.17420: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ae skipping: [managed_node1] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882690.17648: no more pending results, returning what we have 30529 1726882690.17652: results queue empty 30529 1726882690.17653: checking for any_errors_fatal 30529 1726882690.17659: done checking for any_errors_fatal 30529 1726882690.17660: checking for max_fail_percentage 30529 1726882690.17662: done checking for max_fail_percentage 30529 1726882690.17663: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.17664: done checking to see if all hosts have failed 30529 1726882690.17665: getting the remaining hosts for this loop 30529 1726882690.17666: done getting the remaining hosts for this loop 30529 1726882690.17670: getting the next task for host managed_node1 30529 1726882690.17679: done getting next task for host managed_node1 30529 1726882690.17683: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882690.17687: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882690.17711: getting variables 30529 1726882690.17713: in VariableManager get_vars() 30529 1726882690.17759: Calling all_inventory to load vars for managed_node1 30529 1726882690.17762: Calling groups_inventory to load vars for managed_node1 30529 1726882690.17765: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.17776: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.17780: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.17783: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.18506: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021ae 30529 1726882690.18510: WORKER PROCESS EXITING 30529 1726882690.19421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.20506: done with get_vars() 30529 1726882690.20527: done getting variables 30529 1726882690.20587: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:10 -0400 (0:00:00.094) 0:01:44.232 ****** 30529 1726882690.20626: entering _queue_task() for managed_node1/service 30529 1726882690.20953: worker is 1 (out of 1 available) 30529 1726882690.20966: exiting _queue_task() for managed_node1/service 30529 1726882690.20979: done queuing 
things up, now waiting for results queue to drain 30529 1726882690.20981: waiting for pending results... 30529 1726882690.21269: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882690.21411: in run() - task 12673a56-9f93-b0f1-edc0-0000000021af 30529 1726882690.21437: variable 'ansible_search_path' from source: unknown 30529 1726882690.21448: variable 'ansible_search_path' from source: unknown 30529 1726882690.21509: calling self._execute() 30529 1726882690.21616: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.21630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.21655: variable 'omit' from source: magic vars 30529 1726882690.22012: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.22030: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.22180: variable 'network_provider' from source: set_fact 30529 1726882690.22191: variable 'network_state' from source: role '' defaults 30529 1726882690.22208: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882690.22220: variable 'omit' from source: magic vars 30529 1726882690.22411: variable 'omit' from source: magic vars 30529 1726882690.22415: variable 'network_service_name' from source: role '' defaults 30529 1726882690.22418: variable 'network_service_name' from source: role '' defaults 30529 1726882690.22528: variable '__network_provider_setup' from source: role '' defaults 30529 1726882690.22534: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882690.22602: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882690.22611: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882690.22675: variable '__network_packages_default_nm' from source: role '' defaults 
30529 1726882690.22987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882690.26387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882690.26475: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882690.26629: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882690.26700: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882690.26703: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882690.26781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.26823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.26863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.26909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.27002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.27005: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.27008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.27032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.27073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.27090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.27338: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882690.27456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.27484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.27517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.27564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.27582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.27680: variable 'ansible_python' from source: facts 30529 1726882690.27706: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882690.27797: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882690.27881: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882690.28012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.28090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.28095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.28115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.28132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.28180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.28222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.28249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.28290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.28399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.28456: variable 'network_connections' from source: include params 30529 1726882690.28468: variable 'interface' from source: play vars 30529 1726882690.28547: variable 'interface' from source: play vars 30529 1726882690.28656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882690.28858: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882690.28912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882690.28961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882690.29005: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882690.29069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882690.29106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882690.29141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.29181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882690.29236: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882690.29525: variable 'network_connections' from source: include params 30529 1726882690.29536: variable 'interface' from source: play vars 30529 1726882690.29720: variable 'interface' from source: play vars 30529 1726882690.29724: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882690.29744: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882690.30034: variable 'network_connections' from source: include params 30529 1726882690.30047: variable 'interface' from source: play vars 30529 1726882690.30109: variable 'interface' from source: play vars 30529 1726882690.30134: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882690.30208: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882690.30508: variable 'network_connections' from source: include params 30529 1726882690.30518: variable 'interface' from source: play vars 30529 1726882690.30595: variable 'interface' from source: play vars 30529 1726882690.30659: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882690.30725: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882690.30737: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882690.30805: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882690.31035: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882690.31555: variable 'network_connections' from source: include params 30529 1726882690.31566: variable 'interface' from source: play vars 30529 1726882690.31632: variable 'interface' from source: play vars 30529 1726882690.31647: variable 'ansible_distribution' from source: facts 30529 1726882690.31655: variable '__network_rh_distros' from source: role '' defaults 30529 1726882690.31684: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.31704: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882690.31878: variable 'ansible_distribution' from source: facts 30529 1726882690.31899: variable '__network_rh_distros' from source: role '' defaults 30529 1726882690.31902: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.31913: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882690.32116: variable 'ansible_distribution' from source: facts 30529 1726882690.32120: variable '__network_rh_distros' from source: role '' defaults 30529 1726882690.32122: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.32145: variable 'network_provider' from source: set_fact 30529 1726882690.32173: variable 'omit' from source: magic vars 30529 1726882690.32206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882690.32402: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882690.32405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882690.32408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882690.32410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882690.32413: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882690.32416: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.32418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.32431: Set connection var ansible_shell_executable to /bin/sh 30529 1726882690.32437: Set connection var ansible_pipelining to False 30529 1726882690.32440: Set connection var ansible_shell_type to sh 30529 1726882690.32451: Set connection var ansible_timeout to 10 30529 1726882690.32454: Set connection var ansible_connection to ssh 30529 1726882690.32456: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882690.32482: variable 'ansible_shell_executable' from source: unknown 30529 1726882690.32485: variable 'ansible_connection' from source: unknown 30529 1726882690.32490: variable 'ansible_module_compression' from source: unknown 30529 1726882690.32494: variable 'ansible_shell_type' from source: unknown 30529 1726882690.32496: variable 'ansible_shell_executable' from source: unknown 30529 1726882690.32499: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.32501: variable 'ansible_pipelining' from source: unknown 30529 1726882690.32503: variable 'ansible_timeout' from source: unknown 30529 1726882690.32573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882690.32611: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882690.32629: variable 'omit' from source: magic vars 30529 1726882690.32634: starting attempt loop 30529 1726882690.32637: running the handler 30529 1726882690.32712: variable 'ansible_facts' from source: unknown 30529 1726882690.33433: _low_level_execute_command(): starting 30529 1726882690.33445: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882690.33898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.33930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882690.33934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882690.33936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.33938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.33940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.33999: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882690.34002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882690.34004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.34049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.35750: stdout chunk (state=3): >>>/root <<< 30529 1726882690.35856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882690.35886: stderr chunk (state=3): >>><<< 30529 1726882690.35888: stdout chunk (state=3): >>><<< 30529 1726882690.35909: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882690.35920: _low_level_execute_command(): starting 30529 1726882690.35926: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189 `" && echo ansible-tmp-1726882690.3590848-35420-185622195029189="` echo /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189 `" ) && sleep 0' 30529 1726882690.36334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882690.36337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882690.36339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882690.36341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882690.36343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.36401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882690.36404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.36439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.38308: stdout chunk (state=3): 
>>>ansible-tmp-1726882690.3590848-35420-185622195029189=/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189 <<< 30529 1726882690.38468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882690.38472: stdout chunk (state=3): >>><<< 30529 1726882690.38474: stderr chunk (state=3): >>><<< 30529 1726882690.38626: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882690.3590848-35420-185622195029189=/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882690.38631: variable 'ansible_module_compression' from source: unknown 30529 1726882690.38633: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882690.38657: variable 'ansible_facts' 
from source: unknown 30529 1726882690.38805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py 30529 1726882690.38899: Sending initial data 30529 1726882690.38902: Sent initial data (156 bytes) 30529 1726882690.39325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882690.39328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882690.39334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.39336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.39338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882690.39340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.39374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882690.39398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.39431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.41004: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882690.41049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882690.41104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_2jy2lj3 /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py <<< 30529 1726882690.41108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py" <<< 30529 1726882690.41127: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_2jy2lj3" to remote "/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py" <<< 30529 1726882690.42798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882690.42801: stdout chunk (state=3): >>><<< 30529 1726882690.42803: stderr chunk (state=3): >>><<< 30529 1726882690.42824: done transferring module to remote 30529 1726882690.42836: _low_level_execute_command(): starting 30529 1726882690.42841: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/ /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py && sleep 0' 30529 1726882690.43333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882690.43375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882690.43378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.43380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882690.43383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.43386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.43429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882690.43442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.43487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.45213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882690.45244: stderr chunk 
(state=3): >>><<< 30529 1726882690.45247: stdout chunk (state=3): >>><<< 30529 1726882690.45258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882690.45260: _low_level_execute_command(): starting 30529 1726882690.45266: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/AnsiballZ_systemd.py && sleep 0' 30529 1726882690.45664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.45748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882690.45752: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.45754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882690.45757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882690.45759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.45781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882690.45795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.45868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.74573: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", 
"FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10764288", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314221056", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1933496000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", 
"IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", 
"MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882690.76312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882690.76332: stderr chunk (state=3): >>><<< 30529 1726882690.76335: stdout chunk (state=3): >>><<< 30529 1726882690.76350: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10764288", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314221056", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1933496000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882690.76477: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882690.76497: _low_level_execute_command(): starting 30529 1726882690.76502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882690.3590848-35420-185622195029189/ > /dev/null 2>&1 && sleep 0' 30529 1726882690.76938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882690.76941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882690.76943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.76945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882690.76947: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882690.76999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882690.77003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882690.77019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882690.77049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882690.78833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882690.78856: stderr chunk (state=3): >>><<< 30529 1726882690.78859: stdout chunk (state=3): >>><<< 30529 1726882690.78869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882690.78876: handler run complete 30529 1726882690.78921: attempt loop complete, returning result 30529 1726882690.78924: _execute() done 30529 1726882690.78926: dumping result to json 30529 1726882690.78937: done dumping result, returning 30529 1726882690.78945: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-0000000021af] 30529 1726882690.78948: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021af 30529 1726882690.79294: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021af 30529 1726882690.79297: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882690.79352: no more pending results, returning what we have 30529 1726882690.79355: results queue empty 30529 1726882690.79356: checking for any_errors_fatal 30529 1726882690.79361: done checking for any_errors_fatal 30529 1726882690.79362: checking for max_fail_percentage 30529 1726882690.79363: done checking for max_fail_percentage 30529 1726882690.79364: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.79365: done checking to see if all hosts have failed 30529 1726882690.79366: getting the remaining hosts for this loop 30529 1726882690.79368: done getting the remaining hosts for this loop 30529 1726882690.79371: getting the next task for host managed_node1 30529 1726882690.79378: done getting next task for host managed_node1 30529 1726882690.79381: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882690.79385: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882690.79399: getting variables 30529 1726882690.79401: in VariableManager get_vars() 30529 1726882690.79439: Calling all_inventory to load vars for managed_node1 30529 1726882690.79441: Calling groups_inventory to load vars for managed_node1 30529 1726882690.79444: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.79452: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.79455: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.79457: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.80283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.81268: done with get_vars() 30529 1726882690.81285: done getting variables 30529 1726882690.81333: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:10 -0400 (0:00:00.607) 0:01:44.839 ****** 30529 1726882690.81364: entering _queue_task() for managed_node1/service 30529 1726882690.81629: worker is 1 (out of 1 available) 30529 1726882690.81641: exiting _queue_task() for managed_node1/service 30529 1726882690.81654: done queuing things up, now waiting for results queue to drain 30529 1726882690.81655: waiting for pending results... 
30529 1726882690.81851: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882690.81938: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b0 30529 1726882690.81951: variable 'ansible_search_path' from source: unknown 30529 1726882690.81955: variable 'ansible_search_path' from source: unknown 30529 1726882690.81982: calling self._execute() 30529 1726882690.82063: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.82067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.82075: variable 'omit' from source: magic vars 30529 1726882690.82374: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.82384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.82470: variable 'network_provider' from source: set_fact 30529 1726882690.82473: Evaluated conditional (network_provider == "nm"): True 30529 1726882690.82539: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882690.82602: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882690.82720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882690.84185: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882690.84232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882690.84260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882690.84287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882690.84312: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882690.84380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.84407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.84425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.84451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.84462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.84502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.84518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.84535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.84559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.84569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.84602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.84621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.84637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.84660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.84670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.84765: variable 'network_connections' from source: include params 30529 1726882690.84773: variable 'interface' from source: play vars 30529 1726882690.84824: variable 'interface' from source: play vars 30529 1726882690.84873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882690.84983: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882690.85016: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882690.85038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882690.85062: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882690.85090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882690.85110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882690.85127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.85144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882690.85185: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882690.85339: variable 'network_connections' from source: include params 30529 1726882690.85342: variable 'interface' from source: play vars 30529 1726882690.85386: variable 'interface' from source: play vars 30529 1726882690.85420: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882690.85423: when evaluation is False, skipping this task 30529 1726882690.85426: _execute() done 30529 1726882690.85428: dumping result to json 30529 1726882690.85430: done dumping result, returning 30529 1726882690.85436: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-0000000021b0] 30529 
1726882690.85447: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b0 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882690.85575: no more pending results, returning what we have 30529 1726882690.85579: results queue empty 30529 1726882690.85580: checking for any_errors_fatal 30529 1726882690.85604: done checking for any_errors_fatal 30529 1726882690.85604: checking for max_fail_percentage 30529 1726882690.85606: done checking for max_fail_percentage 30529 1726882690.85607: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.85608: done checking to see if all hosts have failed 30529 1726882690.85609: getting the remaining hosts for this loop 30529 1726882690.85611: done getting the remaining hosts for this loop 30529 1726882690.85615: getting the next task for host managed_node1 30529 1726882690.85623: done getting next task for host managed_node1 30529 1726882690.85627: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882690.85632: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882690.85651: getting variables 30529 1726882690.85653: in VariableManager get_vars() 30529 1726882690.85701: Calling all_inventory to load vars for managed_node1 30529 1726882690.85703: Calling groups_inventory to load vars for managed_node1 30529 1726882690.85706: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.85711: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b0 30529 1726882690.85713: WORKER PROCESS EXITING 30529 1726882690.85721: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.85724: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.85726: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.86527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.87398: done with get_vars() 30529 1726882690.87414: done getting variables 30529 1726882690.87456: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:10 -0400 (0:00:00.061) 0:01:44.900 ****** 30529 1726882690.87480: entering _queue_task() for managed_node1/service 30529 1726882690.87716: worker is 1 (out of 1 available) 30529 
1726882690.87729: exiting _queue_task() for managed_node1/service 30529 1726882690.87743: done queuing things up, now waiting for results queue to drain 30529 1726882690.87744: waiting for pending results... 30529 1726882690.87928: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882690.88016: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b1 30529 1726882690.88028: variable 'ansible_search_path' from source: unknown 30529 1726882690.88032: variable 'ansible_search_path' from source: unknown 30529 1726882690.88059: calling self._execute() 30529 1726882690.88136: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.88140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.88148: variable 'omit' from source: magic vars 30529 1726882690.88426: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.88436: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.88519: variable 'network_provider' from source: set_fact 30529 1726882690.88523: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882690.88525: when evaluation is False, skipping this task 30529 1726882690.88528: _execute() done 30529 1726882690.88530: dumping result to json 30529 1726882690.88532: done dumping result, returning 30529 1726882690.88538: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-0000000021b1] 30529 1726882690.88543: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b1 30529 1726882690.88630: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b1 30529 1726882690.88633: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 
1726882690.88680: no more pending results, returning what we have 30529 1726882690.88684: results queue empty 30529 1726882690.88685: checking for any_errors_fatal 30529 1726882690.88698: done checking for any_errors_fatal 30529 1726882690.88699: checking for max_fail_percentage 30529 1726882690.88700: done checking for max_fail_percentage 30529 1726882690.88701: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.88702: done checking to see if all hosts have failed 30529 1726882690.88703: getting the remaining hosts for this loop 30529 1726882690.88704: done getting the remaining hosts for this loop 30529 1726882690.88708: getting the next task for host managed_node1 30529 1726882690.88715: done getting next task for host managed_node1 30529 1726882690.88719: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882690.88723: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882690.88741: getting variables 30529 1726882690.88743: in VariableManager get_vars() 30529 1726882690.88787: Calling all_inventory to load vars for managed_node1 30529 1726882690.88790: Calling groups_inventory to load vars for managed_node1 30529 1726882690.88792: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.88802: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.88804: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.88807: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.89699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.90732: done with get_vars() 30529 1726882690.90749: done getting variables 30529 1726882690.90789: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:10 -0400 (0:00:00.033) 0:01:44.934 ****** 30529 1726882690.90816: entering _queue_task() for managed_node1/copy 30529 1726882690.91035: worker is 1 (out of 1 available) 30529 1726882690.91048: exiting _queue_task() for managed_node1/copy 30529 1726882690.91061: done queuing things up, now waiting for results queue to drain 30529 1726882690.91063: waiting for pending results... 
30529 1726882690.91249: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882690.91337: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b2 30529 1726882690.91348: variable 'ansible_search_path' from source: unknown 30529 1726882690.91351: variable 'ansible_search_path' from source: unknown 30529 1726882690.91377: calling self._execute() 30529 1726882690.91455: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.91459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.91467: variable 'omit' from source: magic vars 30529 1726882690.91749: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.91758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.91987: variable 'network_provider' from source: set_fact 30529 1726882690.91991: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882690.91994: when evaluation is False, skipping this task 30529 1726882690.91996: _execute() done 30529 1726882690.91999: dumping result to json 30529 1726882690.92001: done dumping result, returning 30529 1726882690.92005: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-0000000021b2] 30529 1726882690.92007: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b2 30529 1726882690.92074: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b2 30529 1726882690.92077: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882690.92121: no more pending results, returning what we have 30529 1726882690.92124: results queue empty 30529 1726882690.92125: checking for 
any_errors_fatal 30529 1726882690.92130: done checking for any_errors_fatal 30529 1726882690.92130: checking for max_fail_percentage 30529 1726882690.92132: done checking for max_fail_percentage 30529 1726882690.92132: checking to see if all hosts have failed and the running result is not ok 30529 1726882690.92133: done checking to see if all hosts have failed 30529 1726882690.92134: getting the remaining hosts for this loop 30529 1726882690.92135: done getting the remaining hosts for this loop 30529 1726882690.92138: getting the next task for host managed_node1 30529 1726882690.92145: done getting next task for host managed_node1 30529 1726882690.92148: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882690.92152: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882690.92168: getting variables 30529 1726882690.92170: in VariableManager get_vars() 30529 1726882690.92217: Calling all_inventory to load vars for managed_node1 30529 1726882690.92220: Calling groups_inventory to load vars for managed_node1 30529 1726882690.92223: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882690.92232: Calling all_plugins_play to load vars for managed_node1 30529 1726882690.92235: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882690.92238: Calling groups_plugins_play to load vars for managed_node1 30529 1726882690.93551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882690.95190: done with get_vars() 30529 1726882690.95231: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:10 -0400 (0:00:00.045) 0:01:44.979 ****** 30529 1726882690.95331: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882690.95652: worker is 1 (out of 1 available) 30529 1726882690.95666: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882690.95680: done queuing things up, now waiting for results queue to drain 30529 1726882690.95682: waiting for pending results... 
30529 1726882690.95887: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882690.95984: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b3 30529 1726882690.96001: variable 'ansible_search_path' from source: unknown 30529 1726882690.96005: variable 'ansible_search_path' from source: unknown 30529 1726882690.96034: calling self._execute() 30529 1726882690.96114: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882690.96117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882690.96128: variable 'omit' from source: magic vars 30529 1726882690.96412: variable 'ansible_distribution_major_version' from source: facts 30529 1726882690.96422: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882690.96429: variable 'omit' from source: magic vars 30529 1726882690.96477: variable 'omit' from source: magic vars 30529 1726882690.96591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882690.98998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882690.99008: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882690.99054: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882690.99090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882690.99133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882690.99210: variable 'network_provider' from source: set_fact 30529 1726882690.99365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882690.99400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882690.99430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882690.99485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882690.99508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882690.99658: variable 'omit' from source: magic vars 30529 1726882690.99714: variable 'omit' from source: magic vars 30529 1726882690.99828: variable 'network_connections' from source: include params 30529 1726882690.99844: variable 'interface' from source: play vars 30529 1726882690.99919: variable 'interface' from source: play vars 30529 1726882691.00086: variable 'omit' from source: magic vars 30529 1726882691.00112: variable '__lsr_ansible_managed' from source: task vars 30529 1726882691.00173: variable '__lsr_ansible_managed' from source: task vars 30529 1726882691.00359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882691.00596: Loaded config def from plugin (lookup/template) 30529 1726882691.00606: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882691.00645: File lookup term: get_ansible_managed.j2 30529 1726882691.00754: variable 
'ansible_search_path' from source: unknown 30529 1726882691.00758: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882691.00762: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882691.00765: variable 'ansible_search_path' from source: unknown 30529 1726882691.06680: variable 'ansible_managed' from source: unknown 30529 1726882691.06820: variable 'omit' from source: magic vars 30529 1726882691.06846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882691.06871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882691.06889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882691.06910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882691.06922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.06998: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882691.07001: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.07004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.07047: Set connection var ansible_shell_executable to /bin/sh 30529 1726882691.07053: Set connection var ansible_pipelining to False 30529 1726882691.07055: Set connection var ansible_shell_type to sh 30529 1726882691.07149: Set connection var ansible_timeout to 10 30529 1726882691.07152: Set connection var ansible_connection to ssh 30529 1726882691.07154: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882691.07157: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.07159: variable 'ansible_connection' from source: unknown 30529 1726882691.07161: variable 'ansible_module_compression' from source: unknown 30529 1726882691.07163: variable 'ansible_shell_type' from source: unknown 30529 1726882691.07165: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.07167: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.07169: variable 'ansible_pipelining' from source: unknown 30529 1726882691.07172: variable 'ansible_timeout' from source: unknown 30529 1726882691.07174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.07256: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882691.07267: variable 'omit' from 
source: magic vars 30529 1726882691.07270: starting attempt loop 30529 1726882691.07273: running the handler 30529 1726882691.07289: _low_level_execute_command(): starting 30529 1726882691.07299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882691.07977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882691.07988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.08005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.08021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.08035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882691.08041: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882691.08049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.08063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882691.08125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882691.08128: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882691.08130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.08132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.08134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.08136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882691.08138: stderr chunk (state=3): >>>debug2: match found <<< 30529 1726882691.08140: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.08188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.08205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.08233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.08312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.09948: stdout chunk (state=3): >>>/root <<< 30529 1726882691.10082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.10100: stderr chunk (state=3): >>><<< 30529 1726882691.10108: stdout chunk (state=3): >>><<< 30529 1726882691.10131: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30529 1726882691.10209: _low_level_execute_command(): starting 30529 1726882691.10213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803 `" && echo ansible-tmp-1726882691.101369-35454-196509060771803="` echo /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803 `" ) && sleep 0' 30529 1726882691.10857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.10861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.10863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.10865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882691.10867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.10935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.10945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.10969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.11034: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.12910: stdout chunk (state=3): >>>ansible-tmp-1726882691.101369-35454-196509060771803=/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803 <<< 30529 1726882691.13076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.13080: stdout chunk (state=3): >>><<< 30529 1726882691.13082: stderr chunk (state=3): >>><<< 30529 1726882691.13104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882691.101369-35454-196509060771803=/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.13156: variable 'ansible_module_compression' from source: unknown 30529 1726882691.13290: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882691.13300: variable 'ansible_facts' from source: unknown 30529 1726882691.13424: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py 30529 1726882691.13650: Sending initial data 30529 1726882691.13659: Sent initial data (167 bytes) 30529 1726882691.14195: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882691.14304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.14323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.14338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.14412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.15992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882691.16251: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882691.16316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjq1okawt /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py <<< 30529 1726882691.16332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py" <<< 30529 1726882691.16388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjq1okawt" to remote "/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py" <<< 30529 1726882691.17379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.17411: stderr chunk (state=3): >>><<< 30529 1726882691.17424: stdout chunk (state=3): >>><<< 30529 1726882691.17471: done transferring module to remote 30529 1726882691.17486: _low_level_execute_command(): starting 
30529 1726882691.17497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/ /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py && sleep 0' 30529 1726882691.18136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.18150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.18165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.18208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.18259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.18275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.18300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.18367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.20105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.20199: stderr chunk (state=3): >>><<< 30529 
1726882691.20202: stdout chunk (state=3): >>><<< 30529 1726882691.20204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.20207: _low_level_execute_command(): starting 30529 1726882691.20209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/AnsiballZ_network_connections.py && sleep 0' 30529 1726882691.20843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.20866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.20879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.20958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.47928: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882691.51460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882691.51484: stderr chunk (state=3): >>><<< 30529 1726882691.51487: stdout chunk (state=3): >>><<< 30529 1726882691.51508: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882691.51543: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882691.51551: _low_level_execute_command(): starting 30529 1726882691.51556: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882691.101369-35454-196509060771803/ > /dev/null 2>&1 && sleep 0' 30529 1726882691.51974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.51983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.52007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.52010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.52068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.52073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.52076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.52120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.54096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.54125: stderr chunk (state=3): >>><<< 30529 1726882691.54128: stdout chunk (state=3): >>><<< 30529 1726882691.54139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.54145: handler run complete 30529 1726882691.54165: attempt loop complete, returning result 30529 1726882691.54168: _execute() done 30529 1726882691.54170: dumping result to json 30529 1726882691.54175: done dumping result, returning 30529 1726882691.54183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-0000000021b3] 30529 1726882691.54194: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b3 30529 1726882691.54292: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b3 30529 1726882691.54298: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 30529 1726882691.54421: no more pending results, returning what we have 30529 1726882691.54424: results queue empty 30529 1726882691.54425: checking for any_errors_fatal 30529 1726882691.54435: done checking for any_errors_fatal 30529 1726882691.54436: checking for max_fail_percentage 30529 1726882691.54437: done 
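For readers following the transcript: the task above shows Ansible's standard remote module lifecycle — upload the AnsiballZ payload over SFTP, `chmod u+x` the temp directory and script, execute it with the remote interpreter, then delete the temp directory. A minimal sketch reproducing the shape of the three `_low_level_execute_command()` shell commands visible in the log; the `build_lifecycle_commands` helper and its paths are illustrative, not Ansible's actual API:

```python
import shlex

def build_lifecycle_commands(tmpdir: str, module: str,
                             python: str = "/usr/bin/python3.12") -> list[str]:
    """Sketch of the three shell commands the log shows Ansible running
    after the SFTP upload: chmod, execute, cleanup. Illustrative only."""
    q = shlex.quote
    return [
        # make the temp dir and the AnsiballZ script executable
        f"/bin/sh -c 'chmod u+x {q(tmpdir)} {q(module)} && sleep 0'",
        # run the module with the discovered remote interpreter
        f"/bin/sh -c '{q(python)} {q(module)} && sleep 0'",
        # remove the whole temp directory, discarding any output
        f"/bin/sh -c 'rm -f -r {q(tmpdir)} > /dev/null 2>&1 && sleep 0'",
    ]
```

The trailing `&& sleep 0` in each command matches the log verbatim; Ansible appends it so the shell's exit status reflects the preceding command while keeping the connection open long enough to flush output.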
checking for max_fail_percentage 30529 1726882691.54438: checking to see if all hosts have failed and the running result is not ok 30529 1726882691.54439: done checking to see if all hosts have failed 30529 1726882691.54440: getting the remaining hosts for this loop 30529 1726882691.54441: done getting the remaining hosts for this loop 30529 1726882691.54445: getting the next task for host managed_node1 30529 1726882691.54452: done getting next task for host managed_node1 30529 1726882691.54455: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882691.54460: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882691.54472: getting variables 30529 1726882691.54473: in VariableManager get_vars() 30529 1726882691.54521: Calling all_inventory to load vars for managed_node1 30529 1726882691.54524: Calling groups_inventory to load vars for managed_node1 30529 1726882691.54526: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882691.54535: Calling all_plugins_play to load vars for managed_node1 30529 1726882691.54537: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882691.54540: Calling groups_plugins_play to load vars for managed_node1 30529 1726882691.55483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882691.56346: done with get_vars() 30529 1726882691.56362: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:11 -0400 (0:00:00.610) 0:01:45.590 ****** 30529 1726882691.56428: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882691.56667: worker is 1 (out of 1 available) 30529 1726882691.56681: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882691.56697: done queuing things up, now waiting for results queue to drain 30529 1726882691.56699: waiting for pending results... 
30529 1726882691.56886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882691.56987: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b4 30529 1726882691.57003: variable 'ansible_search_path' from source: unknown 30529 1726882691.57007: variable 'ansible_search_path' from source: unknown 30529 1726882691.57036: calling self._execute() 30529 1726882691.57110: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.57114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.57122: variable 'omit' from source: magic vars 30529 1726882691.57412: variable 'ansible_distribution_major_version' from source: facts 30529 1726882691.57421: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882691.57508: variable 'network_state' from source: role '' defaults 30529 1726882691.57517: Evaluated conditional (network_state != {}): False 30529 1726882691.57520: when evaluation is False, skipping this task 30529 1726882691.57529: _execute() done 30529 1726882691.57531: dumping result to json 30529 1726882691.57534: done dumping result, returning 30529 1726882691.57553: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-0000000021b4] 30529 1726882691.57557: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b4 30529 1726882691.57651: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b4 30529 1726882691.57654: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882691.57740: no more pending results, returning what we have 30529 1726882691.57743: results queue empty 30529 1726882691.57744: checking for any_errors_fatal 30529 1726882691.57752: done checking for any_errors_fatal 
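The skip above follows directly from the role default: "Configure networking state" only runs when `network_state` is non-empty, and this play never sets it. A toy illustration of the two conditionals the log shows being evaluated (the `network_state` value is the role default from the transcript; the exact distribution version is not shown in this chunk, so any non-`'6'` value is assumed):

```python
# assumption: some major version other than '6' (exact value not in this log chunk)
ansible_distribution_major_version = "9"
# role default, as reported by "variable 'network_state' from source: role '' defaults"
network_state = {}

# first conditional in the log: evaluates True, so processing continues
run_gate = ansible_distribution_major_version != "6"

# second conditional: evaluates False, producing the skipped result with
# "false_condition": "network_state != {}"
state_gate = network_state != {}
```

When `state_gate` is False the task executor short-circuits: no connection is opened, no module is transferred, and the host just receives the `skipping:` result shown above.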
30529 1726882691.57752: checking for max_fail_percentage 30529 1726882691.57754: done checking for max_fail_percentage 30529 1726882691.57755: checking to see if all hosts have failed and the running result is not ok 30529 1726882691.57755: done checking to see if all hosts have failed 30529 1726882691.57756: getting the remaining hosts for this loop 30529 1726882691.57757: done getting the remaining hosts for this loop 30529 1726882691.57760: getting the next task for host managed_node1 30529 1726882691.57767: done getting next task for host managed_node1 30529 1726882691.57770: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882691.57775: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882691.57792: getting variables 30529 1726882691.57796: in VariableManager get_vars() 30529 1726882691.57830: Calling all_inventory to load vars for managed_node1 30529 1726882691.57832: Calling groups_inventory to load vars for managed_node1 30529 1726882691.57834: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882691.57842: Calling all_plugins_play to load vars for managed_node1 30529 1726882691.57844: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882691.57846: Calling groups_plugins_play to load vars for managed_node1 30529 1726882691.58587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882691.59905: done with get_vars() 30529 1726882691.59926: done getting variables 30529 1726882691.59983: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:11 -0400 (0:00:00.035) 0:01:45.626 ****** 30529 1726882691.60019: entering _queue_task() for managed_node1/debug 30529 1726882691.60289: worker is 1 (out of 1 available) 30529 1726882691.60305: exiting _queue_task() for managed_node1/debug 30529 1726882691.60319: done queuing things up, now waiting for results queue to drain 30529 1726882691.60321: waiting for pending results... 
30529 1726882691.60518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882691.60618: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b5 30529 1726882691.60632: variable 'ansible_search_path' from source: unknown 30529 1726882691.60635: variable 'ansible_search_path' from source: unknown 30529 1726882691.60662: calling self._execute() 30529 1726882691.60736: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.60740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.60751: variable 'omit' from source: magic vars 30529 1726882691.61022: variable 'ansible_distribution_major_version' from source: facts 30529 1726882691.61031: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882691.61037: variable 'omit' from source: magic vars 30529 1726882691.61084: variable 'omit' from source: magic vars 30529 1726882691.61111: variable 'omit' from source: magic vars 30529 1726882691.61141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882691.61169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882691.61185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882691.61203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.61214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.61237: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882691.61240: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.61243: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882691.61323: Set connection var ansible_shell_executable to /bin/sh 30529 1726882691.61326: Set connection var ansible_pipelining to False 30529 1726882691.61329: Set connection var ansible_shell_type to sh 30529 1726882691.61337: Set connection var ansible_timeout to 10 30529 1726882691.61339: Set connection var ansible_connection to ssh 30529 1726882691.61344: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882691.61360: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.61364: variable 'ansible_connection' from source: unknown 30529 1726882691.61366: variable 'ansible_module_compression' from source: unknown 30529 1726882691.61369: variable 'ansible_shell_type' from source: unknown 30529 1726882691.61373: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.61375: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.61377: variable 'ansible_pipelining' from source: unknown 30529 1726882691.61380: variable 'ansible_timeout' from source: unknown 30529 1726882691.61382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.61482: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882691.61498: variable 'omit' from source: magic vars 30529 1726882691.61503: starting attempt loop 30529 1726882691.61506: running the handler 30529 1726882691.61596: variable '__network_connections_result' from source: set_fact 30529 1726882691.61638: handler run complete 30529 1726882691.61652: attempt loop complete, returning result 30529 1726882691.61655: _execute() done 30529 1726882691.61657: dumping result to json 30529 1726882691.61659: 
done dumping result, returning 30529 1726882691.61668: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000021b5] 30529 1726882691.61670: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b5 30529 1726882691.61756: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b5 30529 1726882691.61759: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67" ] } 30529 1726882691.61826: no more pending results, returning what we have 30529 1726882691.61830: results queue empty 30529 1726882691.61831: checking for any_errors_fatal 30529 1726882691.61837: done checking for any_errors_fatal 30529 1726882691.61838: checking for max_fail_percentage 30529 1726882691.61839: done checking for max_fail_percentage 30529 1726882691.61841: checking to see if all hosts have failed and the running result is not ok 30529 1726882691.61841: done checking to see if all hosts have failed 30529 1726882691.61842: getting the remaining hosts for this loop 30529 1726882691.61844: done getting the remaining hosts for this loop 30529 1726882691.61847: getting the next task for host managed_node1 30529 1726882691.61854: done getting next task for host managed_node1 30529 1726882691.61858: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882691.61863: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882691.61885: getting variables 30529 1726882691.61888: in VariableManager get_vars() 30529 1726882691.61932: Calling all_inventory to load vars for managed_node1 30529 1726882691.61935: Calling groups_inventory to load vars for managed_node1 30529 1726882691.61937: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882691.61946: Calling all_plugins_play to load vars for managed_node1 30529 1726882691.61949: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882691.61951: Calling groups_plugins_play to load vars for managed_node1 30529 1726882691.63409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882691.65029: done with get_vars() 30529 1726882691.65051: done getting variables 30529 1726882691.65115: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:11 -0400 (0:00:00.051) 0:01:45.677 ****** 30529 1726882691.65155: entering _queue_task() for managed_node1/debug 30529 1726882691.65563: worker is 1 (out of 1 available) 30529 1726882691.65573: exiting _queue_task() for managed_node1/debug 30529 1726882691.65584: done queuing things up, now waiting for results queue to drain 30529 1726882691.65585: waiting for pending results... 30529 1726882691.65881: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882691.65979: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b6 30529 1726882691.65983: variable 'ansible_search_path' from source: unknown 30529 1726882691.65985: variable 'ansible_search_path' from source: unknown 30529 1726882691.66017: calling self._execute() 30529 1726882691.66126: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.66200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.66204: variable 'omit' from source: magic vars 30529 1726882691.66571: variable 'ansible_distribution_major_version' from source: facts 30529 1726882691.66590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882691.66604: variable 'omit' from source: magic vars 30529 1726882691.66683: variable 'omit' from source: magic vars 30529 1726882691.66722: variable 'omit' from source: magic vars 30529 1726882691.66775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882691.66821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882691.66854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882691.66878: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.66956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.66960: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882691.66963: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.66965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.67067: Set connection var ansible_shell_executable to /bin/sh 30529 1726882691.67079: Set connection var ansible_pipelining to False 30529 1726882691.67087: Set connection var ansible_shell_type to sh 30529 1726882691.67104: Set connection var ansible_timeout to 10 30529 1726882691.67112: Set connection var ansible_connection to ssh 30529 1726882691.67123: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882691.67148: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.67156: variable 'ansible_connection' from source: unknown 30529 1726882691.67164: variable 'ansible_module_compression' from source: unknown 30529 1726882691.67282: variable 'ansible_shell_type' from source: unknown 30529 1726882691.67285: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.67287: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.67289: variable 'ansible_pipelining' from source: unknown 30529 1726882691.67291: variable 'ansible_timeout' from source: unknown 30529 1726882691.67295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.67353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882691.67371: variable 'omit' from source: magic vars 30529 1726882691.67381: starting attempt loop 30529 1726882691.67387: running the handler 30529 1726882691.67448: variable '__network_connections_result' from source: set_fact 30529 1726882691.67534: variable '__network_connections_result' from source: set_fact 30529 1726882691.67661: handler run complete 30529 1726882691.67738: attempt loop complete, returning result 30529 1726882691.67741: _execute() done 30529 1726882691.67744: dumping result to json 30529 1726882691.67746: done dumping result, returning 30529 1726882691.67748: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000021b6] 30529 1726882691.67751: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b6 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67" ] } } 30529 1726882691.67934: no more pending results, returning what we have 30529 1726882691.67938: results queue empty 30529 1726882691.67939: checking for any_errors_fatal 30529 1726882691.67950: done checking for any_errors_fatal 30529 1726882691.67951: 
checking for max_fail_percentage 30529 1726882691.67953: done checking for max_fail_percentage 30529 1726882691.67955: checking to see if all hosts have failed and the running result is not ok 30529 1726882691.67956: done checking to see if all hosts have failed 30529 1726882691.67956: getting the remaining hosts for this loop 30529 1726882691.67958: done getting the remaining hosts for this loop 30529 1726882691.67962: getting the next task for host managed_node1 30529 1726882691.67971: done getting next task for host managed_node1 30529 1726882691.67975: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882691.67980: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882691.67992: getting variables 30529 1726882691.68100: in VariableManager get_vars() 30529 1726882691.68146: Calling all_inventory to load vars for managed_node1 30529 1726882691.68148: Calling groups_inventory to load vars for managed_node1 30529 1726882691.68211: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b6 30529 1726882691.68324: WORKER PROCESS EXITING 30529 1726882691.68318: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882691.68335: Calling all_plugins_play to load vars for managed_node1 30529 1726882691.68338: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882691.68342: Calling groups_plugins_play to load vars for managed_node1 30529 1726882691.69866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882691.76361: done with get_vars() 30529 1726882691.76387: done getting variables 30529 1726882691.76438: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:11 -0400 (0:00:00.113) 0:01:45.790 ****** 30529 1726882691.76472: entering _queue_task() for managed_node1/debug 30529 1726882691.76865: worker is 1 (out of 1 available) 30529 1726882691.76881: exiting _queue_task() for managed_node1/debug 30529 1726882691.76898: done queuing things up, now waiting for results queue to drain 30529 1726882691.76901: waiting for pending results... 
30529 1726882691.77110: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882691.77226: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b7 30529 1726882691.77237: variable 'ansible_search_path' from source: unknown 30529 1726882691.77241: variable 'ansible_search_path' from source: unknown 30529 1726882691.77272: calling self._execute() 30529 1726882691.77351: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.77357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.77365: variable 'omit' from source: magic vars 30529 1726882691.77654: variable 'ansible_distribution_major_version' from source: facts 30529 1726882691.77663: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882691.77754: variable 'network_state' from source: role '' defaults 30529 1726882691.77764: Evaluated conditional (network_state != {}): False 30529 1726882691.77767: when evaluation is False, skipping this task 30529 1726882691.77770: _execute() done 30529 1726882691.77775: dumping result to json 30529 1726882691.77778: done dumping result, returning 30529 1726882691.77782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-0000000021b7] 30529 1726882691.77792: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b7 30529 1726882691.77880: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b7 30529 1726882691.77883: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882691.77941: no more pending results, returning what we have 30529 1726882691.77945: results queue empty 30529 1726882691.77946: checking for any_errors_fatal 30529 1726882691.77955: done checking for any_errors_fatal 30529 1726882691.77956: checking for 
max_fail_percentage 30529 1726882691.77958: done checking for max_fail_percentage 30529 1726882691.77959: checking to see if all hosts have failed and the running result is not ok 30529 1726882691.77960: done checking to see if all hosts have failed 30529 1726882691.77960: getting the remaining hosts for this loop 30529 1726882691.77962: done getting the remaining hosts for this loop 30529 1726882691.77966: getting the next task for host managed_node1 30529 1726882691.77975: done getting next task for host managed_node1 30529 1726882691.77978: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882691.77982: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882691.78008: getting variables 30529 1726882691.78010: in VariableManager get_vars() 30529 1726882691.78050: Calling all_inventory to load vars for managed_node1 30529 1726882691.78052: Calling groups_inventory to load vars for managed_node1 30529 1726882691.78054: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882691.78064: Calling all_plugins_play to load vars for managed_node1 30529 1726882691.78067: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882691.78069: Calling groups_plugins_play to load vars for managed_node1 30529 1726882691.79213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882691.80567: done with get_vars() 30529 1726882691.80585: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:11 -0400 (0:00:00.041) 0:01:45.832 ****** 30529 1726882691.80654: entering _queue_task() for managed_node1/ping 30529 1726882691.80902: worker is 1 (out of 1 available) 30529 1726882691.80917: exiting _queue_task() for managed_node1/ping 30529 1726882691.80929: done queuing things up, now waiting for results queue to drain 30529 1726882691.80931: waiting for pending results... 
30529 1726882691.81110: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882691.81205: in run() - task 12673a56-9f93-b0f1-edc0-0000000021b8 30529 1726882691.81217: variable 'ansible_search_path' from source: unknown 30529 1726882691.81221: variable 'ansible_search_path' from source: unknown 30529 1726882691.81249: calling self._execute() 30529 1726882691.81328: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.81332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.81339: variable 'omit' from source: magic vars 30529 1726882691.81627: variable 'ansible_distribution_major_version' from source: facts 30529 1726882691.81637: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882691.81643: variable 'omit' from source: magic vars 30529 1726882691.81682: variable 'omit' from source: magic vars 30529 1726882691.81710: variable 'omit' from source: magic vars 30529 1726882691.81738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882691.81764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882691.81780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882691.81795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.81807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882691.81830: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882691.81833: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.81836: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882691.81906: Set connection var ansible_shell_executable to /bin/sh 30529 1726882691.81910: Set connection var ansible_pipelining to False 30529 1726882691.81913: Set connection var ansible_shell_type to sh 30529 1726882691.81925: Set connection var ansible_timeout to 10 30529 1726882691.81928: Set connection var ansible_connection to ssh 30529 1726882691.81931: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882691.81946: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.81948: variable 'ansible_connection' from source: unknown 30529 1726882691.81951: variable 'ansible_module_compression' from source: unknown 30529 1726882691.81954: variable 'ansible_shell_type' from source: unknown 30529 1726882691.81956: variable 'ansible_shell_executable' from source: unknown 30529 1726882691.81958: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882691.81961: variable 'ansible_pipelining' from source: unknown 30529 1726882691.81964: variable 'ansible_timeout' from source: unknown 30529 1726882691.81968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882691.82207: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882691.82212: variable 'omit' from source: magic vars 30529 1726882691.82214: starting attempt loop 30529 1726882691.82217: running the handler 30529 1726882691.82230: _low_level_execute_command(): starting 30529 1726882691.82239: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882691.82877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882691.82992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 
1726882691.82997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.82999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.83002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882691.83003: stderr chunk (state=3): >>>debug2: match not found <<< 30529 1726882691.83005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.83007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882691.83009: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882691.83011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30529 1726882691.83012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.83014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.83016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882691.83081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.83084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.83098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.83164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.84812: stdout chunk (state=3): >>>/root <<< 30529 1726882691.84914: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 30529 1726882691.84937: stderr chunk (state=3): >>><<< 30529 1726882691.84942: stdout chunk (state=3): >>><<< 30529 1726882691.84966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.84974: _low_level_execute_command(): starting 30529 1726882691.84978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687 `" && echo ansible-tmp-1726882691.8496153-35485-101668721486687="` echo /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687 `" ) && sleep 0' 30529 1726882691.85376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882691.85390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882691.85422: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.85425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882691.85428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882691.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.85474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.85477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.85532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.87364: stdout chunk (state=3): >>>ansible-tmp-1726882691.8496153-35485-101668721486687=/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687 <<< 30529 1726882691.87467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.87492: stderr chunk (state=3): >>><<< 30529 1726882691.87497: stdout chunk (state=3): >>><<< 30529 1726882691.87512: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882691.8496153-35485-101668721486687=/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.87551: variable 'ansible_module_compression' from source: unknown 30529 1726882691.87583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882691.87616: variable 'ansible_facts' from source: unknown 30529 1726882691.87671: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py 30529 1726882691.87771: Sending initial data 30529 1726882691.87774: Sent initial data (153 bytes) 30529 1726882691.88182: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.88220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882691.88223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.88226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882691.88228: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.88271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.88274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.88321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.89836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882691.89841: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882691.89875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882691.89925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp62ynd3bf /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py <<< 30529 1726882691.89927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py" <<< 30529 1726882691.89960: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp62ynd3bf" to remote "/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py" <<< 30529 1726882691.89966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py" <<< 30529 1726882691.90474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.90514: stderr chunk (state=3): >>><<< 30529 1726882691.90518: stdout chunk (state=3): >>><<< 30529 1726882691.90534: done transferring module to remote 30529 1726882691.90543: _low_level_execute_command(): starting 30529 1726882691.90547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/ /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py && sleep 0' 30529 1726882691.90965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
30529 1726882691.90968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882691.90974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.90977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.90978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.91026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.91029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.91074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882691.92772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882691.92791: stderr chunk (state=3): >>><<< 30529 1726882691.92796: stdout chunk (state=3): >>><<< 30529 1726882691.92807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882691.92809: _low_level_execute_command(): starting 30529 1726882691.92815: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/AnsiballZ_ping.py && sleep 0' 30529 1726882691.93230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882691.93233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882691.93235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.93237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882691.93239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882691.93285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882691.93288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882691.93294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882691.93338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.07996: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882692.09203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882692.09231: stderr chunk (state=3): >>><<< 30529 1726882692.09234: stdout chunk (state=3): >>><<< 30529 1726882692.09253: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882692.09274: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882692.09286: _low_level_execute_command(): starting 30529 1726882692.09294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882691.8496153-35485-101668721486687/ > /dev/null 2>&1 && sleep 0' 30529 1726882692.09749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.09753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.09755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882692.09758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882692.09765: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.09814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882692.09817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.09865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.11813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882692.11816: stdout chunk (state=3): >>><<< 30529 1726882692.11819: stderr chunk (state=3): >>><<< 30529 1726882692.11821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
30529 1726882692.11823: handler run complete 30529 1726882692.11825: attempt loop complete, returning result 30529 1726882692.11826: _execute() done 30529 1726882692.11828: dumping result to json 30529 1726882692.11829: done dumping result, returning 30529 1726882692.11831: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-0000000021b8] 30529 1726882692.11832: sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b8 ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882692.11962: no more pending results, returning what we have 30529 1726882692.11966: results queue empty 30529 1726882692.11967: checking for any_errors_fatal 30529 1726882692.12102: done checking for any_errors_fatal 30529 1726882692.12103: checking for max_fail_percentage 30529 1726882692.12105: done checking for max_fail_percentage 30529 1726882692.12106: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.12107: done checking to see if all hosts have failed 30529 1726882692.12108: getting the remaining hosts for this loop 30529 1726882692.12110: done getting the remaining hosts for this loop 30529 1726882692.12113: getting the next task for host managed_node1 30529 1726882692.12126: done getting next task for host managed_node1 30529 1726882692.12128: ^ task is: TASK: meta (role_complete) 30529 1726882692.12133: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882692.12146: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000021b8 30529 1726882692.12150: WORKER PROCESS EXITING 30529 1726882692.12157: getting variables 30529 1726882692.12159: in VariableManager get_vars() 30529 1726882692.12212: Calling all_inventory to load vars for managed_node1 30529 1726882692.12215: Calling groups_inventory to load vars for managed_node1 30529 1726882692.12217: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.12228: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.12232: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.12235: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.13505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.14361: done with get_vars() 30529 1726882692.14376: done getting variables 30529 1726882692.14439: done queuing things up, now waiting for results queue to drain 30529 1726882692.14441: results queue empty 30529 1726882692.14441: checking for any_errors_fatal 30529 1726882692.14443: done checking for any_errors_fatal 30529 1726882692.14443: checking for max_fail_percentage 30529 1726882692.14444: done checking for max_fail_percentage 30529 1726882692.14445: checking to see if all hosts have failed and the running result 
is not ok 30529 1726882692.14445: done checking to see if all hosts have failed 30529 1726882692.14445: getting the remaining hosts for this loop 30529 1726882692.14446: done getting the remaining hosts for this loop 30529 1726882692.14448: getting the next task for host managed_node1 30529 1726882692.14452: done getting next task for host managed_node1 30529 1726882692.14454: ^ task is: TASK: Show result 30529 1726882692.14456: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882692.14457: getting variables 30529 1726882692.14458: in VariableManager get_vars() 30529 1726882692.14466: Calling all_inventory to load vars for managed_node1 30529 1726882692.14467: Calling groups_inventory to load vars for managed_node1 30529 1726882692.14469: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.14472: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.14473: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.14475: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.15512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.17097: done with get_vars() 30529 1726882692.17120: done getting variables 30529 1726882692.17160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:38:12 -0400 (0:00:00.365) 0:01:46.198 ****** 30529 1726882692.17195: entering _queue_task() for managed_node1/debug 30529 1726882692.17528: worker is 1 (out of 1 available) 30529 1726882692.17541: exiting _queue_task() for managed_node1/debug 30529 1726882692.17553: done queuing things up, now waiting for results queue to drain 30529 1726882692.17555: waiting for pending results... 
30529 1726882692.17849: running TaskExecutor() for managed_node1/TASK: Show result 30529 1726882692.17970: in run() - task 12673a56-9f93-b0f1-edc0-00000000213a 30529 1726882692.18000: variable 'ansible_search_path' from source: unknown 30529 1726882692.18004: variable 'ansible_search_path' from source: unknown 30529 1726882692.18035: calling self._execute() 30529 1726882692.18128: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.18135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.18142: variable 'omit' from source: magic vars 30529 1726882692.18544: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.18555: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.18568: variable 'omit' from source: magic vars 30529 1726882692.18624: variable 'omit' from source: magic vars 30529 1726882692.18655: variable 'omit' from source: magic vars 30529 1726882692.18699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882692.18735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882692.18756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882692.18794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882692.18799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882692.18823: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882692.18826: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.18830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.18999: Set 
connection var ansible_shell_executable to /bin/sh 30529 1726882692.19007: Set connection var ansible_pipelining to False 30529 1726882692.19010: Set connection var ansible_shell_type to sh 30529 1726882692.19013: Set connection var ansible_timeout to 10 30529 1726882692.19015: Set connection var ansible_connection to ssh 30529 1726882692.19018: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882692.19020: variable 'ansible_shell_executable' from source: unknown 30529 1726882692.19022: variable 'ansible_connection' from source: unknown 30529 1726882692.19024: variable 'ansible_module_compression' from source: unknown 30529 1726882692.19027: variable 'ansible_shell_type' from source: unknown 30529 1726882692.19029: variable 'ansible_shell_executable' from source: unknown 30529 1726882692.19031: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.19033: variable 'ansible_pipelining' from source: unknown 30529 1726882692.19035: variable 'ansible_timeout' from source: unknown 30529 1726882692.19037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.19142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882692.19154: variable 'omit' from source: magic vars 30529 1726882692.19225: starting attempt loop 30529 1726882692.19229: running the handler 30529 1726882692.19232: variable '__network_connections_result' from source: set_fact 30529 1726882692.19297: variable '__network_connections_result' from source: set_fact 30529 1726882692.19419: handler run complete 30529 1726882692.19448: attempt loop complete, returning result 30529 1726882692.19451: _execute() done 30529 1726882692.19453: dumping result to json 30529 
1726882692.19456: done dumping result, returning 30529 1726882692.19464: done running TaskExecutor() for managed_node1/TASK: Show result [12673a56-9f93-b0f1-edc0-00000000213a] 30529 1726882692.19467: sending task result for task 12673a56-9f93-b0f1-edc0-00000000213a 30529 1726882692.19682: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000213a 30529 1726882692.19686: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67" ] } } 30529 1726882692.19757: no more pending results, returning what we have 30529 1726882692.19760: results queue empty 30529 1726882692.19761: checking for any_errors_fatal 30529 1726882692.19763: done checking for any_errors_fatal 30529 1726882692.19764: checking for max_fail_percentage 30529 1726882692.19765: done checking for max_fail_percentage 30529 1726882692.19766: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.19767: done checking to see if all hosts have failed 30529 1726882692.19768: getting the remaining hosts for this loop 30529 1726882692.19770: done getting the remaining hosts for this loop 30529 1726882692.19774: getting the next task for host managed_node1 30529 1726882692.19785: done getting next task for host managed_node1 30529 1726882692.19791: ^ task is: TASK: Include network role 30529 1726882692.19795: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882692.19800: getting variables 30529 1726882692.19802: in VariableManager get_vars() 30529 1726882692.19837: Calling all_inventory to load vars for managed_node1 30529 1726882692.19839: Calling groups_inventory to load vars for managed_node1 30529 1726882692.19843: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.19853: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.19856: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.19859: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.21426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.23002: done with get_vars() 30529 1726882692.23026: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:38:12 -0400 (0:00:00.059) 0:01:46.257 ****** 30529 1726882692.23124: entering _queue_task() for 
managed_node1/include_role 30529 1726882692.23455: worker is 1 (out of 1 available) 30529 1726882692.23466: exiting _queue_task() for managed_node1/include_role 30529 1726882692.23478: done queuing things up, now waiting for results queue to drain 30529 1726882692.23479: waiting for pending results... 30529 1726882692.23780: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882692.23931: in run() - task 12673a56-9f93-b0f1-edc0-00000000213e 30529 1726882692.23945: variable 'ansible_search_path' from source: unknown 30529 1726882692.23949: variable 'ansible_search_path' from source: unknown 30529 1726882692.23981: calling self._execute() 30529 1726882692.24099: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.24103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.24106: variable 'omit' from source: magic vars 30529 1726882692.24480: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.24699: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.24702: _execute() done 30529 1726882692.24705: dumping result to json 30529 1726882692.24707: done dumping result, returning 30529 1726882692.24709: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-00000000213e] 30529 1726882692.24711: sending task result for task 12673a56-9f93-b0f1-edc0-00000000213e 30529 1726882692.24780: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000213e 30529 1726882692.24783: WORKER PROCESS EXITING 30529 1726882692.24860: no more pending results, returning what we have 30529 1726882692.24865: in VariableManager get_vars() 30529 1726882692.24906: Calling all_inventory to load vars for managed_node1 30529 1726882692.24908: Calling groups_inventory to load vars for managed_node1 30529 1726882692.24911: Calling all_plugins_inventory to load vars for managed_node1 30529 
1726882692.24920: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.24923: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.24926: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.26202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.27978: done with get_vars() 30529 1726882692.28004: variable 'ansible_search_path' from source: unknown 30529 1726882692.28005: variable 'ansible_search_path' from source: unknown 30529 1726882692.28153: variable 'omit' from source: magic vars 30529 1726882692.28201: variable 'omit' from source: magic vars 30529 1726882692.28217: variable 'omit' from source: magic vars 30529 1726882692.28220: we have included files to process 30529 1726882692.28221: generating all_blocks data 30529 1726882692.28224: done generating all_blocks data 30529 1726882692.28229: processing included file: fedora.linux_system_roles.network 30529 1726882692.28251: in VariableManager get_vars() 30529 1726882692.28266: done with get_vars() 30529 1726882692.28298: in VariableManager get_vars() 30529 1726882692.28319: done with get_vars() 30529 1726882692.28358: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882692.28499: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882692.28570: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882692.28968: in VariableManager get_vars() 30529 1726882692.28990: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882692.30894: iterating over new_blocks loaded from include file 30529 1726882692.30897: in VariableManager get_vars() 30529 1726882692.30918: done with get_vars() 30529 1726882692.30920: 
filtering new block on tags 30529 1726882692.31220: done filtering new block on tags 30529 1726882692.31224: in VariableManager get_vars() 30529 1726882692.31242: done with get_vars() 30529 1726882692.31243: filtering new block on tags 30529 1726882692.31259: done filtering new block on tags 30529 1726882692.31261: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882692.31267: extending task lists for all hosts with included blocks 30529 1726882692.31383: done extending task lists 30529 1726882692.31384: done processing included files 30529 1726882692.31385: results queue empty 30529 1726882692.31386: checking for any_errors_fatal 30529 1726882692.31395: done checking for any_errors_fatal 30529 1726882692.31396: checking for max_fail_percentage 30529 1726882692.31397: done checking for max_fail_percentage 30529 1726882692.31398: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.31399: done checking to see if all hosts have failed 30529 1726882692.31400: getting the remaining hosts for this loop 30529 1726882692.31401: done getting the remaining hosts for this loop 30529 1726882692.31404: getting the next task for host managed_node1 30529 1726882692.31409: done getting next task for host managed_node1 30529 1726882692.31412: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882692.31415: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882692.31427: getting variables 30529 1726882692.31428: in VariableManager get_vars() 30529 1726882692.31442: Calling all_inventory to load vars for managed_node1 30529 1726882692.31445: Calling groups_inventory to load vars for managed_node1 30529 1726882692.31447: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.31453: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.31455: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.31458: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.32625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.34178: done with get_vars() 30529 1726882692.34209: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:12 -0400 (0:00:00.111) 0:01:46.368 ****** 30529 1726882692.34286: entering _queue_task() for managed_node1/include_tasks 30529 1726882692.34670: worker is 1 (out of 1 available) 30529 
1726882692.34682: exiting _queue_task() for managed_node1/include_tasks 30529 1726882692.34699: done queuing things up, now waiting for results queue to drain 30529 1726882692.34700: waiting for pending results... 30529 1726882692.35001: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882692.35134: in run() - task 12673a56-9f93-b0f1-edc0-000000002328 30529 1726882692.35202: variable 'ansible_search_path' from source: unknown 30529 1726882692.35206: variable 'ansible_search_path' from source: unknown 30529 1726882692.35210: calling self._execute() 30529 1726882692.35287: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.35296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.35309: variable 'omit' from source: magic vars 30529 1726882692.35697: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.35710: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.35716: _execute() done 30529 1726882692.35719: dumping result to json 30529 1726882692.35722: done dumping result, returning 30529 1726882692.35729: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000002328] 30529 1726882692.35744: sending task result for task 12673a56-9f93-b0f1-edc0-000000002328 30529 1726882692.36115: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002328 30529 1726882692.36120: WORKER PROCESS EXITING 30529 1726882692.36160: no more pending results, returning what we have 30529 1726882692.36166: in VariableManager get_vars() 30529 1726882692.36215: Calling all_inventory to load vars for managed_node1 30529 1726882692.36219: Calling groups_inventory to load vars for managed_node1 30529 1726882692.36221: Calling all_plugins_inventory to load vars for managed_node1 
30529 1726882692.36230: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.36233: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.36236: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.37629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.39139: done with get_vars() 30529 1726882692.39162: variable 'ansible_search_path' from source: unknown 30529 1726882692.39163: variable 'ansible_search_path' from source: unknown 30529 1726882692.39207: we have included files to process 30529 1726882692.39209: generating all_blocks data 30529 1726882692.39210: done generating all_blocks data 30529 1726882692.39213: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882692.39215: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882692.39217: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882692.39812: done processing included file 30529 1726882692.39814: iterating over new_blocks loaded from include file 30529 1726882692.39815: in VariableManager get_vars() 30529 1726882692.39842: done with get_vars() 30529 1726882692.39844: filtering new block on tags 30529 1726882692.39873: done filtering new block on tags 30529 1726882692.39876: in VariableManager get_vars() 30529 1726882692.39904: done with get_vars() 30529 1726882692.39906: filtering new block on tags 30529 1726882692.39948: done filtering new block on tags 30529 1726882692.39951: in VariableManager get_vars() 30529 1726882692.39975: done with get_vars() 30529 1726882692.39977: filtering new block on tags 30529 1726882692.40026: done filtering new block on tags 30529 1726882692.40028: done iterating over new_blocks 
loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882692.40033: extending task lists for all hosts with included blocks 30529 1726882692.41589: done extending task lists 30529 1726882692.41590: done processing included files 30529 1726882692.41590: results queue empty 30529 1726882692.41591: checking for any_errors_fatal 30529 1726882692.41595: done checking for any_errors_fatal 30529 1726882692.41595: checking for max_fail_percentage 30529 1726882692.41596: done checking for max_fail_percentage 30529 1726882692.41597: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.41597: done checking to see if all hosts have failed 30529 1726882692.41598: getting the remaining hosts for this loop 30529 1726882692.41599: done getting the remaining hosts for this loop 30529 1726882692.41600: getting the next task for host managed_node1 30529 1726882692.41604: done getting next task for host managed_node1 30529 1726882692.41606: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882692.41608: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882692.41616: getting variables 30529 1726882692.41617: in VariableManager get_vars() 30529 1726882692.41626: Calling all_inventory to load vars for managed_node1 30529 1726882692.41628: Calling groups_inventory to load vars for managed_node1 30529 1726882692.41629: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.41633: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.41634: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.41636: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.42339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.43236: done with get_vars() 30529 1726882692.43254: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:12 -0400 (0:00:00.090) 0:01:46.459 ****** 30529 1726882692.43323: entering _queue_task() for managed_node1/setup 30529 1726882692.43609: worker is 1 (out of 1 available) 30529 1726882692.43625: exiting _queue_task() for managed_node1/setup 30529 
1726882692.43638: done queuing things up, now waiting for results queue to drain 30529 1726882692.43640: waiting for pending results... 30529 1726882692.43822: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882692.43928: in run() - task 12673a56-9f93-b0f1-edc0-00000000237f 30529 1726882692.43941: variable 'ansible_search_path' from source: unknown 30529 1726882692.43945: variable 'ansible_search_path' from source: unknown 30529 1726882692.43972: calling self._execute() 30529 1726882692.44041: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.44045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.44053: variable 'omit' from source: magic vars 30529 1726882692.44402: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.44405: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.44663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882692.46091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882692.46132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882692.46160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882692.46189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882692.46211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882692.46267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30529 1726882692.46291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882692.46310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882692.46336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882692.46347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882692.46382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882692.46400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882692.46421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882692.46445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882692.46455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882692.46568: variable '__network_required_facts' from source: role '' defaults 30529 1726882692.46575: variable 'ansible_facts' from source: unknown 30529 1726882692.47016: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882692.47020: when evaluation is False, skipping this task 30529 1726882692.47023: _execute() done 30529 1726882692.47025: dumping result to json 30529 1726882692.47028: done dumping result, returning 30529 1726882692.47034: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-00000000237f] 30529 1726882692.47038: sending task result for task 12673a56-9f93-b0f1-edc0-00000000237f 30529 1726882692.47127: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000237f 30529 1726882692.47130: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882692.47210: no more pending results, returning what we have 30529 1726882692.47215: results queue empty 30529 1726882692.47216: checking for any_errors_fatal 30529 1726882692.47217: done checking for any_errors_fatal 30529 1726882692.47218: checking for max_fail_percentage 30529 1726882692.47219: done checking for max_fail_percentage 30529 1726882692.47220: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.47221: done checking to see if all hosts have failed 30529 1726882692.47222: getting the remaining hosts for this loop 30529 1726882692.47223: done getting the remaining hosts for this loop 30529 1726882692.47227: getting the next task for host managed_node1 30529 1726882692.47238: done getting next task for host managed_node1 
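The task above is skipped because the guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False: every fact the role needs is already present. A minimal Python sketch of that set-difference check, assuming illustrative fact names (the real `__network_required_facts` defaults are not shown in this log):

```python
# Sketch of the "Ensure ansible_facts used by role are present" guard:
# gather facts only when at least one required fact is missing.
# The fact names below are placeholders, not the role's actual defaults.
required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {"distribution": "Fedora", "os_family": "RedHat"}

# Jinja's `difference` filter keeps items of the left list absent from the right.
missing = [f for f in required_facts if f not in gathered_facts]

# Condition from the log: `... | length > 0` -- run setup only if something is missing.
needs_setup = len(missing) > 0
print(missing, needs_setup)  # -> ['distribution_major_version'] True
```

In the run above the difference was empty, so the condition was False and the setup task was skipped (its output is censored due to `no_log: true`).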
30529 1726882692.47242: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882692.47248: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882692.47269: getting variables 30529 1726882692.47271: in VariableManager get_vars() 30529 1726882692.47317: Calling all_inventory to load vars for managed_node1 30529 1726882692.47320: Calling groups_inventory to load vars for managed_node1 30529 1726882692.47322: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.47331: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.47334: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.47342: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.48126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.49005: done with get_vars() 30529 1726882692.49022: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:12 -0400 (0:00:00.057) 0:01:46.517 ****** 30529 1726882692.49091: entering _queue_task() for managed_node1/stat 30529 1726882692.49343: worker is 1 (out of 1 available) 30529 1726882692.49357: exiting _queue_task() for managed_node1/stat 30529 1726882692.49371: done queuing things up, now waiting for results queue to drain 30529 1726882692.49373: waiting for pending results... 
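Each debug line carries a `PID epoch-seconds:` prefix (here `30529 1726882692.49091:`), and the human-readable banner between tasks shows the same instant. Decoding the epoch with the `-0400` offset printed in the banners reproduces the banner timestamp:

```python
# Decode the epoch prefix from the log line "30529 1726882692.49091: ..."
# into the banner form "Friday 20 September 2024 21:38:12 -0400".
from datetime import datetime, timezone, timedelta

edt = timezone(timedelta(hours=-4))  # the -0400 offset shown in the task banners
stamp = datetime.fromtimestamp(1726882692.49091, tz=edt)
print(stamp.strftime("%A %d %B %Y %H:%M:%S %z"))  # -> Friday 20 September 2024 21:38:12 -0400
```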
30529 1726882692.49570: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882692.49677: in run() - task 12673a56-9f93-b0f1-edc0-000000002381 30529 1726882692.49689: variable 'ansible_search_path' from source: unknown 30529 1726882692.49695: variable 'ansible_search_path' from source: unknown 30529 1726882692.49725: calling self._execute() 30529 1726882692.49796: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.49800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.49809: variable 'omit' from source: magic vars 30529 1726882692.50084: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.50097: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.50211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882692.50406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882692.50440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882692.50466: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882692.50492: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882692.50556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882692.50573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882692.50598: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882692.50616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882692.50799: variable '__network_is_ostree' from source: set_fact 30529 1726882692.50803: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882692.50805: when evaluation is False, skipping this task 30529 1726882692.50807: _execute() done 30529 1726882692.50809: dumping result to json 30529 1726882692.50810: done dumping result, returning 30529 1726882692.50812: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-000000002381] 30529 1726882692.50814: sending task result for task 12673a56-9f93-b0f1-edc0-000000002381 30529 1726882692.50873: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002381 30529 1726882692.50876: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882692.50925: no more pending results, returning what we have 30529 1726882692.50928: results queue empty 30529 1726882692.50929: checking for any_errors_fatal 30529 1726882692.50936: done checking for any_errors_fatal 30529 1726882692.50936: checking for max_fail_percentage 30529 1726882692.50938: done checking for max_fail_percentage 30529 1726882692.50939: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.50939: done checking to see if all hosts have failed 30529 1726882692.50940: getting the remaining hosts for this loop 30529 1726882692.50942: done getting the remaining hosts for this loop 30529 
1726882692.50945: getting the next task for host managed_node1 30529 1726882692.50952: done getting next task for host managed_node1 30529 1726882692.50955: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882692.50961: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882692.50979: getting variables 30529 1726882692.50981: in VariableManager get_vars() 30529 1726882692.51018: Calling all_inventory to load vars for managed_node1 30529 1726882692.51020: Calling groups_inventory to load vars for managed_node1 30529 1726882692.51022: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.51030: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.51033: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.51036: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.52396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.53462: done with get_vars() 30529 1726882692.53477: done getting variables 30529 1726882692.53517: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:12 -0400 (0:00:00.044) 0:01:46.561 ****** 30529 1726882692.53544: entering _queue_task() for managed_node1/set_fact 30529 1726882692.53758: worker is 1 (out of 1 available) 30529 1726882692.53771: exiting _queue_task() for managed_node1/set_fact 30529 1726882692.53785: done queuing things up, now waiting for results queue to drain 30529 1726882692.53787: waiting for pending results... 
30529 1726882692.53958: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882692.54069: in run() - task 12673a56-9f93-b0f1-edc0-000000002382 30529 1726882692.54081: variable 'ansible_search_path' from source: unknown 30529 1726882692.54085: variable 'ansible_search_path' from source: unknown 30529 1726882692.54117: calling self._execute() 30529 1726882692.54180: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.54184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.54197: variable 'omit' from source: magic vars 30529 1726882692.54461: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.54471: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.54580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882692.54768: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882692.54805: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882692.54830: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882692.54855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882692.54920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882692.54938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882692.54956: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882692.54973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882692.55041: variable '__network_is_ostree' from source: set_fact 30529 1726882692.55047: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882692.55050: when evaluation is False, skipping this task 30529 1726882692.55052: _execute() done 30529 1726882692.55055: dumping result to json 30529 1726882692.55059: done dumping result, returning 30529 1726882692.55067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-000000002382] 30529 1726882692.55071: sending task result for task 12673a56-9f93-b0f1-edc0-000000002382 30529 1726882692.55154: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002382 30529 1726882692.55157: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882692.55201: no more pending results, returning what we have 30529 1726882692.55205: results queue empty 30529 1726882692.55206: checking for any_errors_fatal 30529 1726882692.55214: done checking for any_errors_fatal 30529 1726882692.55215: checking for max_fail_percentage 30529 1726882692.55217: done checking for max_fail_percentage 30529 1726882692.55218: checking to see if all hosts have failed and the running result is not ok 30529 1726882692.55219: done checking to see if all hosts have failed 30529 1726882692.55219: getting the remaining hosts for this loop 30529 1726882692.55221: done getting the remaining hosts for this loop 
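Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are skipped on the same condition, `not __network_is_ostree is defined`: the fact was already set by an earlier pass through `set_facts.yml`, so the probe only ever runs once per play. A sketch of that run-once guard, with a hypothetical `should_probe` helper:

```python
# Sketch of the `not __network_is_ostree is defined` guard: the ostree
# probe runs on the first role invocation; later invocations see the
# cached fact and skip both the stat task and the set_fact task.
facts = {"__network_is_ostree": False}  # set by an earlier pass of set_facts.yml

def should_probe(facts):
    # Jinja: when: not __network_is_ostree is defined
    return "__network_is_ostree" not in facts

print(should_probe({}))      # first pass: True, probe runs
print(should_probe(facts))   # later passes: False, tasks are skipped
```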
30529 1726882692.55224: getting the next task for host managed_node1 30529 1726882692.55235: done getting next task for host managed_node1 30529 1726882692.55239: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882692.55244: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882692.55263: getting variables 30529 1726882692.55264: in VariableManager get_vars() 30529 1726882692.55303: Calling all_inventory to load vars for managed_node1 30529 1726882692.55305: Calling groups_inventory to load vars for managed_node1 30529 1726882692.55308: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882692.55315: Calling all_plugins_play to load vars for managed_node1 30529 1726882692.55318: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882692.55320: Calling groups_plugins_play to load vars for managed_node1 30529 1726882692.56057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882692.56914: done with get_vars() 30529 1726882692.56928: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:12 -0400 (0:00:00.034) 0:01:46.596 ****** 30529 1726882692.56992: entering _queue_task() for managed_node1/service_facts 30529 1726882692.57203: worker is 1 (out of 1 available) 30529 1726882692.57216: exiting _queue_task() for managed_node1/service_facts 30529 1726882692.57230: done queuing things up, now waiting for results queue to drain 30529 1726882692.57231: waiting for pending results... 
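The `service_facts` task queued above is the first in this excerpt to actually reach the remote host, so the lines that follow show connection-variable resolution ("Set connection var ...") before any SSH traffic. Each `ansible_*` connection variable falls back to a default when the host does not define it; the values below mirror the ones printed in this log, but the dictionary-based resolution is only an illustrative sketch, not Ansible's implementation:

```python
# Sketch of the "Set connection var ..." resolution seen below: host vars
# win, otherwise the plugin default applies. Values mirror this log.
defaults = {
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
    "ansible_connection": "ssh",
}
hostvars = {"ansible_host": "10.31.9.159"}  # address taken from the SSH debug output

resolved = {k: hostvars.get(k, v) for k, v in defaults.items()}
print(resolved["ansible_connection"], resolved["ansible_timeout"])  # -> ssh 10
```

With the connection vars fixed, the executor issues its first `_low_level_execute_command()` (`/bin/sh -c 'echo ~ && sleep 0'`) to discover the remote home directory, reusing the existing SSH ControlMaster socket visible in the `auto-mux` debug lines.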
30529 1726882692.57415: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882692.57511: in run() - task 12673a56-9f93-b0f1-edc0-000000002384 30529 1726882692.57524: variable 'ansible_search_path' from source: unknown 30529 1726882692.57529: variable 'ansible_search_path' from source: unknown 30529 1726882692.57553: calling self._execute() 30529 1726882692.57628: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.57632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.57641: variable 'omit' from source: magic vars 30529 1726882692.57904: variable 'ansible_distribution_major_version' from source: facts 30529 1726882692.57913: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882692.57920: variable 'omit' from source: magic vars 30529 1726882692.57975: variable 'omit' from source: magic vars 30529 1726882692.58002: variable 'omit' from source: magic vars 30529 1726882692.58034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882692.58059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882692.58076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882692.58089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882692.58105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882692.58128: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882692.58131: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.58133: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882692.58203: Set connection var ansible_shell_executable to /bin/sh 30529 1726882692.58207: Set connection var ansible_pipelining to False 30529 1726882692.58210: Set connection var ansible_shell_type to sh 30529 1726882692.58220: Set connection var ansible_timeout to 10 30529 1726882692.58223: Set connection var ansible_connection to ssh 30529 1726882692.58225: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882692.58242: variable 'ansible_shell_executable' from source: unknown 30529 1726882692.58245: variable 'ansible_connection' from source: unknown 30529 1726882692.58248: variable 'ansible_module_compression' from source: unknown 30529 1726882692.58250: variable 'ansible_shell_type' from source: unknown 30529 1726882692.58253: variable 'ansible_shell_executable' from source: unknown 30529 1726882692.58255: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882692.58257: variable 'ansible_pipelining' from source: unknown 30529 1726882692.58259: variable 'ansible_timeout' from source: unknown 30529 1726882692.58262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882692.58402: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882692.58411: variable 'omit' from source: magic vars 30529 1726882692.58416: starting attempt loop 30529 1726882692.58419: running the handler 30529 1726882692.58432: _low_level_execute_command(): starting 30529 1726882692.58440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882692.58946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882692.58950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.58953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.58955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.59008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882692.59011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.59067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.60745: stdout chunk (state=3): >>>/root <<< 30529 1726882692.60847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882692.60875: stderr chunk (state=3): >>><<< 30529 1726882692.60879: stdout chunk (state=3): >>><<< 30529 1726882692.60900: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882692.60914: _low_level_execute_command(): starting 30529 1726882692.60918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074 `" && echo ansible-tmp-1726882692.6089966-35523-64259501733074="` echo /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074 `" ) && sleep 0' 30529 1726882692.61363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.61367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882692.61369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 
1726882692.61379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882692.61382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.61424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882692.61428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882692.61434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.61473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.63311: stdout chunk (state=3): >>>ansible-tmp-1726882692.6089966-35523-64259501733074=/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074 <<< 30529 1726882692.63422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882692.63447: stderr chunk (state=3): >>><<< 30529 1726882692.63450: stdout chunk (state=3): >>><<< 30529 1726882692.63463: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882692.6089966-35523-64259501733074=/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882692.63508: variable 'ansible_module_compression' from source: unknown 30529 1726882692.63540: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882692.63573: variable 'ansible_facts' from source: unknown 30529 1726882692.63630: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py 30529 1726882692.63734: Sending initial data 30529 1726882692.63737: Sent initial data (161 bytes) 30529 1726882692.64153: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.64162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882692.64188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.64192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.64247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882692.64254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882692.64256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.64299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.65804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882692.65843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882692.65891: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj13t7ez1 /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py <<< 30529 1726882692.65895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py" <<< 30529 1726882692.65928: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpj13t7ez1" to remote "/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py" <<< 30529 1726882692.66471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882692.66512: stderr chunk (state=3): >>><<< 30529 1726882692.66515: stdout chunk (state=3): >>><<< 30529 1726882692.66581: done transferring module to remote 30529 1726882692.66592: _low_level_execute_command(): starting 30529 1726882692.66597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/ /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py && sleep 0' 30529 1726882692.67036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882692.67039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882692.67042: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882692.67044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882692.67046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882692.67100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882692.67108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.67158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882692.68858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882692.68879: stderr chunk (state=3): >>><<< 30529 1726882692.68882: stdout chunk (state=3): >>><<< 30529 1726882692.68897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882692.68900: _low_level_execute_command(): starting 30529 1726882692.68904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/AnsiballZ_service_facts.py && sleep 0' 30529 1726882692.69411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882692.69426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882692.69485: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.20871: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30529 1726882694.20889: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30529 1726882694.20902: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30529 1726882694.20906: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30529 1726882694.20923: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882694.22427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882694.22453: stderr chunk (state=3): >>><<< 30529 1726882694.22456: stdout chunk (state=3): >>><<< 30529 1726882694.22490: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882694.23203: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882694.23211: _low_level_execute_command(): starting 30529 1726882694.23216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882692.6089966-35523-64259501733074/ > /dev/null 2>&1 && sleep 0' 30529 1726882694.23844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882694.23860: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.23945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882694.23962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882694.23978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.24036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.25797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.25818: stderr chunk (state=3): >>><<< 30529 1726882694.25821: stdout chunk (state=3): >>><<< 30529 1726882694.25834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882694.25839: handler run complete 30529 
1726882694.25951: variable 'ansible_facts' from source: unknown 30529 1726882694.26044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.26324: variable 'ansible_facts' from source: unknown 30529 1726882694.26401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.26515: attempt loop complete, returning result 30529 1726882694.26520: _execute() done 30529 1726882694.26523: dumping result to json 30529 1726882694.26561: done dumping result, returning 30529 1726882694.26568: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000002384] 30529 1726882694.26571: sending task result for task 12673a56-9f93-b0f1-edc0-000000002384 30529 1726882694.27331: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002384 30529 1726882694.27334: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882694.27386: no more pending results, returning what we have 30529 1726882694.27388: results queue empty 30529 1726882694.27389: checking for any_errors_fatal 30529 1726882694.27391: done checking for any_errors_fatal 30529 1726882694.27392: checking for max_fail_percentage 30529 1726882694.27395: done checking for max_fail_percentage 30529 1726882694.27396: checking to see if all hosts have failed and the running result is not ok 30529 1726882694.27397: done checking to see if all hosts have failed 30529 1726882694.27397: getting the remaining hosts for this loop 30529 1726882694.27398: done getting the remaining hosts for this loop 30529 1726882694.27400: getting the next task for host managed_node1 30529 1726882694.27405: done getting next task for host managed_node1 30529 1726882694.27407: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882694.27412: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882694.27422: getting variables 30529 1726882694.27423: in VariableManager get_vars() 30529 1726882694.27446: Calling all_inventory to load vars for managed_node1 30529 1726882694.27448: Calling groups_inventory to load vars for managed_node1 30529 1726882694.27450: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882694.27456: Calling all_plugins_play to load vars for managed_node1 30529 1726882694.27458: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882694.27459: Calling groups_plugins_play to load vars for managed_node1 30529 1726882694.28138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.29086: done with get_vars() 30529 1726882694.29104: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:14 -0400 (0:00:01.721) 0:01:48.317 ****** 30529 1726882694.29174: entering _queue_task() for managed_node1/package_facts 30529 1726882694.29409: worker is 1 (out of 1 available) 30529 1726882694.29423: exiting _queue_task() for managed_node1/package_facts 30529 1726882694.29436: done queuing things up, now waiting for results queue to drain 30529 1726882694.29437: waiting for pending results... 
30529 1726882694.29626: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882694.29739: in run() - task 12673a56-9f93-b0f1-edc0-000000002385 30529 1726882694.29752: variable 'ansible_search_path' from source: unknown 30529 1726882694.29756: variable 'ansible_search_path' from source: unknown 30529 1726882694.29784: calling self._execute() 30529 1726882694.29856: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882694.29860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882694.29868: variable 'omit' from source: magic vars 30529 1726882694.30147: variable 'ansible_distribution_major_version' from source: facts 30529 1726882694.30157: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882694.30163: variable 'omit' from source: magic vars 30529 1726882694.30222: variable 'omit' from source: magic vars 30529 1726882694.30244: variable 'omit' from source: magic vars 30529 1726882694.30275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882694.30306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882694.30325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882694.30339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882694.30350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882694.30372: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882694.30375: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882694.30377: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882694.30454: Set connection var ansible_shell_executable to /bin/sh 30529 1726882694.30457: Set connection var ansible_pipelining to False 30529 1726882694.30460: Set connection var ansible_shell_type to sh 30529 1726882694.30467: Set connection var ansible_timeout to 10 30529 1726882694.30471: Set connection var ansible_connection to ssh 30529 1726882694.30475: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882694.30495: variable 'ansible_shell_executable' from source: unknown 30529 1726882694.30499: variable 'ansible_connection' from source: unknown 30529 1726882694.30502: variable 'ansible_module_compression' from source: unknown 30529 1726882694.30505: variable 'ansible_shell_type' from source: unknown 30529 1726882694.30507: variable 'ansible_shell_executable' from source: unknown 30529 1726882694.30509: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882694.30511: variable 'ansible_pipelining' from source: unknown 30529 1726882694.30514: variable 'ansible_timeout' from source: unknown 30529 1726882694.30518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882694.30659: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882694.30668: variable 'omit' from source: magic vars 30529 1726882694.30672: starting attempt loop 30529 1726882694.30675: running the handler 30529 1726882694.30687: _low_level_execute_command(): starting 30529 1726882694.30699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882694.31201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882694.31204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.31207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.31209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.31261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882694.31268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882694.31270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.31311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.32864: stdout chunk (state=3): >>>/root <<< 30529 1726882694.32960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.32985: stderr chunk (state=3): >>><<< 30529 1726882694.32991: stdout chunk (state=3): >>><<< 30529 1726882694.33010: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882694.33018: _low_level_execute_command(): starting 30529 1726882694.33024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359 `" && echo ansible-tmp-1726882694.3300738-35567-118863337523359="` echo /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359 `" ) && sleep 0' 30529 1726882694.33447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.33451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882694.33453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882694.33462: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.33464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.33502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882694.33514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.33562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.35399: stdout chunk (state=3): >>>ansible-tmp-1726882694.3300738-35567-118863337523359=/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359 <<< 30529 1726882694.35506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.35532: stderr chunk (state=3): >>><<< 30529 1726882694.35535: stdout chunk (state=3): >>><<< 30529 1726882694.35547: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882694.3300738-35567-118863337523359=/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882694.35582: variable 'ansible_module_compression' from source: unknown 30529 1726882694.35622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882694.35675: variable 'ansible_facts' from source: unknown 30529 1726882694.35796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py 30529 1726882694.35896: Sending initial data 30529 1726882694.35899: Sent initial data (162 bytes) 30529 1726882694.36333: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882694.36336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882694.36339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.36341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 30529 1726882694.36343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882694.36345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.36387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882694.36403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.36441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.37929: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882694.37937: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882694.37968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882694.38021: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5hp7e64x /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py <<< 30529 1726882694.38024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py" <<< 30529 1726882694.38058: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp5hp7e64x" to remote "/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py" <<< 30529 1726882694.39076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.39118: stderr chunk (state=3): >>><<< 30529 1726882694.39122: stdout chunk (state=3): >>><<< 30529 1726882694.39136: done transferring module to remote 30529 1726882694.39144: _low_level_execute_command(): starting 30529 1726882694.39147: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/ /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py && sleep 0' 30529 1726882694.39566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.39570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882694.39572: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882694.39578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.39625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882694.39628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.39676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.41424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.41427: stdout chunk (state=3): >>><<< 30529 1726882694.41430: stderr chunk (state=3): >>><<< 30529 1726882694.41441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882694.41449: _low_level_execute_command(): starting 30529 1726882694.41518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/AnsiballZ_package_facts.py && sleep 0' 30529 1726882694.41955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.41969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882694.41979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.42031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882694.42048: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30529 1726882694.42089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.85585: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882694.85603: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30529 1726882694.85647: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 30529 1726882694.85660: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30529 1726882694.85690: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": 
"logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", 
"version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30529 1726882694.85717: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30529 1726882694.85721: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30529 1726882694.85758: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30529 1726882694.85763: stdout 
chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30529 1726882694.85797: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30529 1726882694.85805: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882694.87458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882694.87491: stderr chunk (state=3): >>><<< 30529 1726882694.87496: stdout chunk (state=3): >>><<< 30529 1726882694.87537: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882694.88780: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882694.88798: _low_level_execute_command(): starting 30529 1726882694.88803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882694.3300738-35567-118863337523359/ > /dev/null 2>&1 && sleep 0' 30529 1726882694.89252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882694.89256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882694.89258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.89260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882694.89262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882694.89318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882694.89325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882694.89364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882694.91175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882694.91204: stderr chunk (state=3): >>><<< 30529 1726882694.91208: stdout chunk (state=3): >>><<< 30529 1726882694.91221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30529 1726882694.91226: handler run complete 30529 1726882694.91753: variable 'ansible_facts' from source: unknown 30529 1726882694.92032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.93080: variable 'ansible_facts' from source: unknown 30529 1726882694.93329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.93706: attempt loop complete, returning result 30529 1726882694.93718: _execute() done 30529 1726882694.93721: dumping result to json 30529 1726882694.93835: done dumping result, returning 30529 1726882694.93843: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000002385] 30529 1726882694.93848: sending task result for task 12673a56-9f93-b0f1-edc0-000000002385 30529 1726882694.95274: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002385 30529 1726882694.95278: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882694.95372: no more pending results, returning what we have 30529 1726882694.95374: results queue empty 30529 1726882694.95375: checking for any_errors_fatal 30529 1726882694.95380: done checking for any_errors_fatal 30529 1726882694.95381: checking for max_fail_percentage 30529 1726882694.95382: done checking for max_fail_percentage 30529 1726882694.95382: checking to see if all hosts have failed and the running result is not ok 30529 1726882694.95383: done checking to see if all hosts have failed 30529 1726882694.95383: getting the remaining hosts for this loop 30529 1726882694.95384: done getting the remaining hosts for this loop 30529 1726882694.95387: getting the next task for host managed_node1 30529 1726882694.95395: done 
getting next task for host managed_node1 30529 1726882694.95397: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882694.95401: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882694.95410: getting variables 30529 1726882694.95411: in VariableManager get_vars() 30529 1726882694.95437: Calling all_inventory to load vars for managed_node1 30529 1726882694.95438: Calling groups_inventory to load vars for managed_node1 30529 1726882694.95440: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882694.95446: Calling all_plugins_play to load vars for managed_node1 30529 1726882694.95448: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882694.95450: Calling groups_plugins_play to load vars for managed_node1 30529 1726882694.96152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882694.97019: done with get_vars() 30529 1726882694.97035: done getting variables 30529 1726882694.97080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:14 -0400 (0:00:00.679) 0:01:48.997 ****** 30529 1726882694.97109: entering _queue_task() for managed_node1/debug 30529 1726882694.97364: worker is 1 (out of 1 available) 30529 1726882694.97378: exiting _queue_task() for managed_node1/debug 30529 1726882694.97391: done queuing things up, now waiting for results queue to drain 30529 1726882694.97394: waiting for pending results... 
30529 1726882694.97584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882694.97685: in run() - task 12673a56-9f93-b0f1-edc0-000000002329 30529 1726882694.97702: variable 'ansible_search_path' from source: unknown 30529 1726882694.97705: variable 'ansible_search_path' from source: unknown 30529 1726882694.97737: calling self._execute() 30529 1726882694.97812: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882694.97816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882694.97824: variable 'omit' from source: magic vars 30529 1726882694.98104: variable 'ansible_distribution_major_version' from source: facts 30529 1726882694.98113: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882694.98120: variable 'omit' from source: magic vars 30529 1726882694.98160: variable 'omit' from source: magic vars 30529 1726882694.98230: variable 'network_provider' from source: set_fact 30529 1726882694.98243: variable 'omit' from source: magic vars 30529 1726882694.98274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882694.98311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882694.98327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882694.98340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882694.98350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882694.98373: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882694.98377: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882694.98379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882694.98454: Set connection var ansible_shell_executable to /bin/sh 30529 1726882694.98458: Set connection var ansible_pipelining to False 30529 1726882694.98460: Set connection var ansible_shell_type to sh 30529 1726882694.98469: Set connection var ansible_timeout to 10 30529 1726882694.98471: Set connection var ansible_connection to ssh 30529 1726882694.98476: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882694.98496: variable 'ansible_shell_executable' from source: unknown 30529 1726882694.98500: variable 'ansible_connection' from source: unknown 30529 1726882694.98504: variable 'ansible_module_compression' from source: unknown 30529 1726882694.98506: variable 'ansible_shell_type' from source: unknown 30529 1726882694.98508: variable 'ansible_shell_executable' from source: unknown 30529 1726882694.98511: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882694.98513: variable 'ansible_pipelining' from source: unknown 30529 1726882694.98515: variable 'ansible_timeout' from source: unknown 30529 1726882694.98517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882694.98614: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882694.98623: variable 'omit' from source: magic vars 30529 1726882694.98634: starting attempt loop 30529 1726882694.98638: running the handler 30529 1726882694.98669: handler run complete 30529 1726882694.98679: attempt loop complete, returning result 30529 1726882694.98682: _execute() done 30529 1726882694.98685: dumping result to json 30529 1726882694.98690: done dumping result, returning 
30529 1726882694.98695: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000002329] 30529 1726882694.98700: sending task result for task 12673a56-9f93-b0f1-edc0-000000002329 30529 1726882694.98779: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002329 30529 1726882694.98782: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882694.98854: no more pending results, returning what we have 30529 1726882694.98857: results queue empty 30529 1726882694.98858: checking for any_errors_fatal 30529 1726882694.98867: done checking for any_errors_fatal 30529 1726882694.98868: checking for max_fail_percentage 30529 1726882694.98869: done checking for max_fail_percentage 30529 1726882694.98870: checking to see if all hosts have failed and the running result is not ok 30529 1726882694.98870: done checking to see if all hosts have failed 30529 1726882694.98871: getting the remaining hosts for this loop 30529 1726882694.98873: done getting the remaining hosts for this loop 30529 1726882694.98876: getting the next task for host managed_node1 30529 1726882694.98884: done getting next task for host managed_node1 30529 1726882694.98895: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882694.98899: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882694.98910: getting variables 30529 1726882694.98912: in VariableManager get_vars() 30529 1726882694.98949: Calling all_inventory to load vars for managed_node1 30529 1726882694.98951: Calling groups_inventory to load vars for managed_node1 30529 1726882694.98953: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882694.98961: Calling all_plugins_play to load vars for managed_node1 30529 1726882694.98964: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882694.98966: Calling groups_plugins_play to load vars for managed_node1 30529 1726882694.99811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.00682: done with get_vars() 30529 1726882695.00701: done getting variables 30529 1726882695.00742: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:15 -0400 (0:00:00.036) 0:01:49.033 ****** 30529 1726882695.00771: entering _queue_task() for managed_node1/fail 30529 1726882695.00985: worker is 1 (out of 1 available) 30529 1726882695.01003: exiting _queue_task() for managed_node1/fail 30529 1726882695.01014: done queuing things up, now waiting for results queue to drain 30529 1726882695.01016: waiting for pending results... 30529 1726882695.01196: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882695.01272: in run() - task 12673a56-9f93-b0f1-edc0-00000000232a 30529 1726882695.01284: variable 'ansible_search_path' from source: unknown 30529 1726882695.01291: variable 'ansible_search_path' from source: unknown 30529 1726882695.01316: calling self._execute() 30529 1726882695.01386: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.01394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.01400: variable 'omit' from source: magic vars 30529 1726882695.01666: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.01681: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.01760: variable 'network_state' from source: role '' defaults 30529 1726882695.01769: Evaluated conditional (network_state != {}): False 30529 1726882695.01772: when evaluation is False, skipping this task 30529 1726882695.01775: _execute() done 30529 1726882695.01778: dumping result to json 30529 1726882695.01780: done dumping result, returning 30529 1726882695.01800: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-00000000232a] 30529 1726882695.01804: sending task result for task 12673a56-9f93-b0f1-edc0-00000000232a 30529 1726882695.01882: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232a 30529 1726882695.01885: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882695.01939: no more pending results, returning what we have 30529 1726882695.01942: results queue empty 30529 1726882695.01943: checking for any_errors_fatal 30529 1726882695.01949: done checking for any_errors_fatal 30529 1726882695.01950: checking for max_fail_percentage 30529 1726882695.01951: done checking for max_fail_percentage 30529 1726882695.01952: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.01953: done checking to see if all hosts have failed 30529 1726882695.01954: getting the remaining hosts for this loop 30529 1726882695.01955: done getting the remaining hosts for this loop 30529 1726882695.01958: getting the next task for host managed_node1 30529 1726882695.01965: done getting next task for host managed_node1 30529 1726882695.01968: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882695.01972: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.01996: getting variables 30529 1726882695.01998: in VariableManager get_vars() 30529 1726882695.02032: Calling all_inventory to load vars for managed_node1 30529 1726882695.02035: Calling groups_inventory to load vars for managed_node1 30529 1726882695.02037: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.02044: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.02047: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.02049: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.02792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.03766: done with get_vars() 30529 1726882695.03781: done getting variables 30529 1726882695.03825: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:15 -0400 (0:00:00.030) 0:01:49.064 ****** 30529 1726882695.03848: entering _queue_task() for managed_node1/fail 30529 1726882695.04055: worker is 1 (out of 1 available) 30529 1726882695.04069: exiting _queue_task() for managed_node1/fail 30529 1726882695.04082: done queuing things up, now waiting for results queue to drain 30529 1726882695.04083: waiting for pending results... 30529 1726882695.04256: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882695.04352: in run() - task 12673a56-9f93-b0f1-edc0-00000000232b 30529 1726882695.04363: variable 'ansible_search_path' from source: unknown 30529 1726882695.04366: variable 'ansible_search_path' from source: unknown 30529 1726882695.04394: calling self._execute() 30529 1726882695.04464: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.04467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.04475: variable 'omit' from source: magic vars 30529 1726882695.04732: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.04746: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.04826: variable 'network_state' from source: role '' defaults 30529 1726882695.04833: Evaluated conditional (network_state != {}): False 30529 1726882695.04836: when evaluation is False, skipping this task 30529 1726882695.04839: _execute() done 30529 1726882695.04842: dumping result to json 30529 1726882695.04845: done dumping result, returning 30529 1726882695.04854: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-00000000232b] 30529 1726882695.04859: sending task result for task 12673a56-9f93-b0f1-edc0-00000000232b 30529 1726882695.04948: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232b 30529 1726882695.04951: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882695.05011: no more pending results, returning what we have 30529 1726882695.05014: results queue empty 30529 1726882695.05015: checking for any_errors_fatal 30529 1726882695.05021: done checking for any_errors_fatal 30529 1726882695.05022: checking for max_fail_percentage 30529 1726882695.05023: done checking for max_fail_percentage 30529 1726882695.05024: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.05025: done checking to see if all hosts have failed 30529 1726882695.05026: getting the remaining hosts for this loop 30529 1726882695.05027: done getting the remaining hosts for this loop 30529 1726882695.05030: getting the next task for host managed_node1 30529 1726882695.05037: done getting next task for host managed_node1 30529 1726882695.05041: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882695.05045: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.05064: getting variables 30529 1726882695.05065: in VariableManager get_vars() 30529 1726882695.05109: Calling all_inventory to load vars for managed_node1 30529 1726882695.05111: Calling groups_inventory to load vars for managed_node1 30529 1726882695.05113: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.05119: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.05121: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.05122: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.09797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.10639: done with get_vars() 30529 1726882695.10658: done getting variables 30529 1726882695.10692: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:15 -0400 (0:00:00.068) 0:01:49.133 ****** 30529 1726882695.10715: entering _queue_task() for managed_node1/fail 30529 1726882695.10989: worker is 1 (out of 1 available) 30529 1726882695.11004: exiting _queue_task() for managed_node1/fail 30529 1726882695.11017: done queuing things up, now waiting for results queue to drain 30529 1726882695.11020: waiting for pending results... 30529 1726882695.11215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882695.11317: in run() - task 12673a56-9f93-b0f1-edc0-00000000232c 30529 1726882695.11328: variable 'ansible_search_path' from source: unknown 30529 1726882695.11333: variable 'ansible_search_path' from source: unknown 30529 1726882695.11364: calling self._execute() 30529 1726882695.11445: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.11451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.11462: variable 'omit' from source: magic vars 30529 1726882695.11754: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.11763: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.11883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.13443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.13498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.13530: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.13555: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.13574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.13640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.13658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.13675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.13704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.13715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.13783: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.13797: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882695.13874: variable 'ansible_distribution' from source: facts 30529 1726882695.13878: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.13885: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882695.14039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.14056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.14076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.14105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.14116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.14148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.14164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.14182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.14211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882695.14222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.14250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.14265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.14282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.14312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.14322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.14516: variable 'network_connections' from source: include params 30529 1726882695.14524: variable 'interface' from source: play vars 30529 1726882695.14569: variable 'interface' from source: play vars 30529 1726882695.14577: variable 'network_state' from source: role '' defaults 30529 1726882695.14628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.14744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.14772: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.14796: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.14823: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.14855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.14871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.14895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.14913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.14931: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882695.14935: when evaluation is False, skipping this task 30529 1726882695.14939: _execute() done 30529 1726882695.14943: dumping result to json 30529 1726882695.14945: done dumping result, returning 30529 1726882695.14956: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-00000000232c] 30529 1726882695.14959: sending task result for task 
12673a56-9f93-b0f1-edc0-00000000232c 30529 1726882695.15039: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232c 30529 1726882695.15042: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882695.15103: no more pending results, returning what we have 30529 1726882695.15106: results queue empty 30529 1726882695.15107: checking for any_errors_fatal 30529 1726882695.15119: done checking for any_errors_fatal 30529 1726882695.15119: checking for max_fail_percentage 30529 1726882695.15121: done checking for max_fail_percentage 30529 1726882695.15122: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.15123: done checking to see if all hosts have failed 30529 1726882695.15123: getting the remaining hosts for this loop 30529 1726882695.15125: done getting the remaining hosts for this loop 30529 1726882695.15129: getting the next task for host managed_node1 30529 1726882695.15142: done getting next task for host managed_node1 30529 1726882695.15146: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882695.15151: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.15174: getting variables 30529 1726882695.15176: in VariableManager get_vars() 30529 1726882695.15222: Calling all_inventory to load vars for managed_node1 30529 1726882695.15224: Calling groups_inventory to load vars for managed_node1 30529 1726882695.15226: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.15235: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.15238: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.15240: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.16097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.17084: done with get_vars() 30529 1726882695.17104: done getting variables 30529 1726882695.17148: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:15 -0400 (0:00:00.064) 0:01:49.197 ****** 30529 1726882695.17173: entering _queue_task() for managed_node1/dnf 30529 1726882695.17431: worker is 1 (out of 1 available) 30529 1726882695.17445: exiting _queue_task() for managed_node1/dnf 30529 1726882695.17458: done queuing things up, now waiting for results queue to drain 30529 1726882695.17459: waiting for pending results... 30529 1726882695.17646: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882695.17757: in run() - task 12673a56-9f93-b0f1-edc0-00000000232d 30529 1726882695.17769: variable 'ansible_search_path' from source: unknown 30529 1726882695.17772: variable 'ansible_search_path' from source: unknown 30529 1726882695.17901: calling self._execute() 30529 1726882695.17906: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.17910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.17912: variable 'omit' from source: magic vars 30529 1726882695.18160: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.18170: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.18305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.19830: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.19890: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.19915: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.19940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.19959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.20021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.20043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.20061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.20086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.20102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.20181: variable 'ansible_distribution' from source: facts 30529 1726882695.20185: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.20199: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882695.20275: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.20361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.20378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.20397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.20425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.20439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.20465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.20481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.20500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.20525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.20541: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.20566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.20581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.20600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.20625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.20636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.20735: variable 'network_connections' from source: include params 30529 1726882695.20744: variable 'interface' from source: play vars 30529 1726882695.20796: variable 'interface' from source: play vars 30529 1726882695.20842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.20962: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.20995: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.21018: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.21040: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.21071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.21094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.21114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.21132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.21166: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882695.21323: variable 'network_connections' from source: include params 30529 1726882695.21327: variable 'interface' from source: play vars 30529 1726882695.21370: variable 'interface' from source: play vars 30529 1726882695.21392: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882695.21397: when evaluation is False, skipping this task 30529 1726882695.21400: _execute() done 30529 1726882695.21411: dumping result to json 30529 1726882695.21414: done dumping result, returning 30529 1726882695.21416: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000232d] 30529 
1726882695.21418: sending task result for task 12673a56-9f93-b0f1-edc0-00000000232d 30529 1726882695.21507: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232d 30529 1726882695.21511: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882695.21574: no more pending results, returning what we have 30529 1726882695.21577: results queue empty 30529 1726882695.21579: checking for any_errors_fatal 30529 1726882695.21586: done checking for any_errors_fatal 30529 1726882695.21586: checking for max_fail_percentage 30529 1726882695.21590: done checking for max_fail_percentage 30529 1726882695.21591: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.21592: done checking to see if all hosts have failed 30529 1726882695.21595: getting the remaining hosts for this loop 30529 1726882695.21597: done getting the remaining hosts for this loop 30529 1726882695.21601: getting the next task for host managed_node1 30529 1726882695.21610: done getting next task for host managed_node1 30529 1726882695.21615: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882695.21621: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.21645: getting variables 30529 1726882695.21646: in VariableManager get_vars() 30529 1726882695.21691: Calling all_inventory to load vars for managed_node1 30529 1726882695.21700: Calling groups_inventory to load vars for managed_node1 30529 1726882695.21703: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.21712: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.21715: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.21717: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.22550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.23435: done with get_vars() 30529 1726882695.23451: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882695.23507: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:15 -0400 (0:00:00.063) 0:01:49.261 ****** 30529 1726882695.23529: entering _queue_task() for managed_node1/yum 30529 1726882695.23775: worker is 1 (out of 1 available) 30529 1726882695.23792: exiting _queue_task() for managed_node1/yum 30529 1726882695.23807: done queuing things up, now waiting for results queue to drain 30529 1726882695.23808: waiting for pending results... 30529 1726882695.23984: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882695.24094: in run() - task 12673a56-9f93-b0f1-edc0-00000000232e 30529 1726882695.24109: variable 'ansible_search_path' from source: unknown 30529 1726882695.24112: variable 'ansible_search_path' from source: unknown 30529 1726882695.24150: calling self._execute() 30529 1726882695.24220: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.24224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.24233: variable 'omit' from source: magic vars 30529 1726882695.24525: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.24534: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.24659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.26212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.26560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.26588: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.26615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.26635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.26699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.26719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.26736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.26766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.26778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.26848: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.26863: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882695.26867: when evaluation is False, skipping this task 30529 1726882695.26869: _execute() done 30529 1726882695.26872: dumping result to json 30529 1726882695.26874: done dumping result, returning 30529 1726882695.26883: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000232e] 30529 1726882695.26886: sending task result for task 12673a56-9f93-b0f1-edc0-00000000232e 30529 1726882695.26979: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232e 30529 1726882695.26983: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882695.27035: no more pending results, returning what we have 30529 1726882695.27038: results queue empty 30529 1726882695.27039: checking for any_errors_fatal 30529 1726882695.27045: done checking for any_errors_fatal 30529 1726882695.27046: checking for max_fail_percentage 30529 1726882695.27048: done checking for max_fail_percentage 30529 1726882695.27049: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.27050: done checking to see if all hosts have failed 30529 1726882695.27051: getting the remaining hosts for this loop 30529 1726882695.27052: done getting the remaining hosts for this loop 30529 1726882695.27056: getting the next task for host managed_node1 30529 1726882695.27065: done getting next task for host managed_node1 30529 1726882695.27068: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882695.27073: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.27105: getting variables 30529 1726882695.27107: in VariableManager get_vars() 30529 1726882695.27150: Calling all_inventory to load vars for managed_node1 30529 1726882695.27153: Calling groups_inventory to load vars for managed_node1 30529 1726882695.27155: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.27164: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.27167: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.27169: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.28147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.29002: done with get_vars() 30529 1726882695.29018: done getting variables 30529 1726882695.29063: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:15 -0400 (0:00:00.055) 0:01:49.316 ****** 30529 1726882695.29089: entering _queue_task() for managed_node1/fail 30529 1726882695.29340: worker is 1 (out of 1 available) 30529 1726882695.29355: exiting _queue_task() for managed_node1/fail 30529 1726882695.29367: done queuing things up, now waiting for results queue to drain 30529 1726882695.29368: waiting for pending results... 30529 1726882695.29557: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882695.29667: in run() - task 12673a56-9f93-b0f1-edc0-00000000232f 30529 1726882695.29678: variable 'ansible_search_path' from source: unknown 30529 1726882695.29682: variable 'ansible_search_path' from source: unknown 30529 1726882695.29721: calling self._execute() 30529 1726882695.29796: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.29800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.29811: variable 'omit' from source: magic vars 30529 1726882695.30092: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.30101: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.30191: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.30320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.31829: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.31882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.31914: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.31941: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.31960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.32024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.32044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.32063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.32092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.32105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.32137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.32153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.32169: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.32196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.32213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.32236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.32252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.32267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.32294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.32304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.32424: variable 'network_connections' from source: include params 30529 1726882695.32435: variable 'interface' from source: play vars 30529 1726882695.32481: variable 'interface' from source: play vars 30529 1726882695.32532: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.32641: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.32680: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.32705: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.32727: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.32762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.32775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.32794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.32812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.32850: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882695.33002: variable 'network_connections' from source: include params 30529 1726882695.33006: variable 'interface' from source: play vars 30529 1726882695.33047: variable 'interface' from source: play vars 30529 1726882695.33065: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882695.33069: when evaluation is False, skipping this task 30529 
1726882695.33072: _execute() done 30529 1726882695.33074: dumping result to json 30529 1726882695.33076: done dumping result, returning 30529 1726882695.33092: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000232f] 30529 1726882695.33096: sending task result for task 12673a56-9f93-b0f1-edc0-00000000232f 30529 1726882695.33179: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000232f 30529 1726882695.33181: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882695.33262: no more pending results, returning what we have 30529 1726882695.33265: results queue empty 30529 1726882695.33266: checking for any_errors_fatal 30529 1726882695.33273: done checking for any_errors_fatal 30529 1726882695.33274: checking for max_fail_percentage 30529 1726882695.33275: done checking for max_fail_percentage 30529 1726882695.33276: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.33277: done checking to see if all hosts have failed 30529 1726882695.33278: getting the remaining hosts for this loop 30529 1726882695.33281: done getting the remaining hosts for this loop 30529 1726882695.33285: getting the next task for host managed_node1 30529 1726882695.33296: done getting next task for host managed_node1 30529 1726882695.33302: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882695.33306: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.33329: getting variables 30529 1726882695.33330: in VariableManager get_vars() 30529 1726882695.33371: Calling all_inventory to load vars for managed_node1 30529 1726882695.33374: Calling groups_inventory to load vars for managed_node1 30529 1726882695.33376: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.33385: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.33390: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.33392: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.34229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.35112: done with get_vars() 30529 1726882695.35130: done getting variables 30529 1726882695.35171: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:15 -0400 (0:00:00.061) 0:01:49.378 ****** 30529 1726882695.35201: entering _queue_task() for managed_node1/package 30529 1726882695.35448: worker is 1 (out of 1 available) 30529 1726882695.35462: exiting _queue_task() for managed_node1/package 30529 1726882695.35475: done queuing things up, now waiting for results queue to drain 30529 1726882695.35477: waiting for pending results... 30529 1726882695.35667: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882695.35771: in run() - task 12673a56-9f93-b0f1-edc0-000000002330 30529 1726882695.35783: variable 'ansible_search_path' from source: unknown 30529 1726882695.35786: variable 'ansible_search_path' from source: unknown 30529 1726882695.35823: calling self._execute() 30529 1726882695.35898: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.35901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.35910: variable 'omit' from source: magic vars 30529 1726882695.36188: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.36202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.36339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.36538: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.36570: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.36603: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.36658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.36746: variable 'network_packages' from source: role '' defaults 30529 1726882695.36824: variable '__network_provider_setup' from source: role '' defaults 30529 1726882695.36834: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882695.36878: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882695.36885: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882695.36934: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882695.37051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.38681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.38727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.38759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.38781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.38806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.38866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.38888: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.38909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.38935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.38945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.38978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.39000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.39016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.39040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.39050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882695.39202: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882695.39274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.39297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.39317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.39341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.39352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.39419: variable 'ansible_python' from source: facts 30529 1726882695.39430: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882695.39484: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882695.39545: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882695.39635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.39650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.39666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.39692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.39705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.39741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.39758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.39775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.39803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.39814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.39911: variable 'network_connections' from source: include params 
30529 1726882695.39917: variable 'interface' from source: play vars 30529 1726882695.39987: variable 'interface' from source: play vars 30529 1726882695.40039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.40057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.40080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.40109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.40145: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.40325: variable 'network_connections' from source: include params 30529 1726882695.40328: variable 'interface' from source: play vars 30529 1726882695.40400: variable 'interface' from source: play vars 30529 1726882695.40422: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882695.40475: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.40670: variable 'network_connections' from source: include params 30529 1726882695.40673: variable 'interface' from source: play vars 30529 1726882695.40721: variable 'interface' from source: play vars 30529 1726882695.40739: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882695.40794: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882695.40983: variable 'network_connections' 
from source: include params 30529 1726882695.40990: variable 'interface' from source: play vars 30529 1726882695.41033: variable 'interface' from source: play vars 30529 1726882695.41072: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882695.41115: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882695.41121: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882695.41163: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882695.41298: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882695.41592: variable 'network_connections' from source: include params 30529 1726882695.41597: variable 'interface' from source: play vars 30529 1726882695.41636: variable 'interface' from source: play vars 30529 1726882695.41642: variable 'ansible_distribution' from source: facts 30529 1726882695.41645: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.41651: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.41661: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882695.41768: variable 'ansible_distribution' from source: facts 30529 1726882695.41771: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.41776: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.41790: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882695.41892: variable 'ansible_distribution' from source: facts 30529 1726882695.41904: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.41909: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.41937: variable 'network_provider' from source: set_fact 30529 
1726882695.41949: variable 'ansible_facts' from source: unknown 30529 1726882695.42339: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882695.42343: when evaluation is False, skipping this task 30529 1726882695.42347: _execute() done 30529 1726882695.42349: dumping result to json 30529 1726882695.42351: done dumping result, returning 30529 1726882695.42363: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-000000002330] 30529 1726882695.42365: sending task result for task 12673a56-9f93-b0f1-edc0-000000002330 30529 1726882695.42453: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002330 30529 1726882695.42456: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882695.42518: no more pending results, returning what we have 30529 1726882695.42522: results queue empty 30529 1726882695.42523: checking for any_errors_fatal 30529 1726882695.42529: done checking for any_errors_fatal 30529 1726882695.42530: checking for max_fail_percentage 30529 1726882695.42531: done checking for max_fail_percentage 30529 1726882695.42532: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.42533: done checking to see if all hosts have failed 30529 1726882695.42533: getting the remaining hosts for this loop 30529 1726882695.42535: done getting the remaining hosts for this loop 30529 1726882695.42539: getting the next task for host managed_node1 30529 1726882695.42547: done getting next task for host managed_node1 30529 1726882695.42550: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882695.42555: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882695.42583: getting variables 30529 1726882695.42584: in VariableManager get_vars() 30529 1726882695.42637: Calling all_inventory to load vars for managed_node1 30529 1726882695.42639: Calling groups_inventory to load vars for managed_node1 30529 1726882695.42641: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.42651: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.42654: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.42656: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.43671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.44547: done with get_vars() 30529 1726882695.44563: done getting variables 30529 1726882695.44609: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:15 -0400 (0:00:00.094) 0:01:49.472 ****** 30529 1726882695.44634: entering _queue_task() for managed_node1/package 30529 1726882695.44886: worker is 1 (out of 1 available) 30529 1726882695.44902: exiting _queue_task() for managed_node1/package 30529 1726882695.44916: done queuing things up, now waiting for results queue to drain 30529 1726882695.44917: waiting for pending results... 
30529 1726882695.45103: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882695.45206: in run() - task 12673a56-9f93-b0f1-edc0-000000002331 30529 1726882695.45227: variable 'ansible_search_path' from source: unknown 30529 1726882695.45232: variable 'ansible_search_path' from source: unknown 30529 1726882695.45255: calling self._execute() 30529 1726882695.45328: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.45339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.45342: variable 'omit' from source: magic vars 30529 1726882695.45618: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.45628: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.45714: variable 'network_state' from source: role '' defaults 30529 1726882695.45723: Evaluated conditional (network_state != {}): False 30529 1726882695.45726: when evaluation is False, skipping this task 30529 1726882695.45728: _execute() done 30529 1726882695.45731: dumping result to json 30529 1726882695.45734: done dumping result, returning 30529 1726882695.45740: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000002331] 30529 1726882695.45746: sending task result for task 12673a56-9f93-b0f1-edc0-000000002331 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882695.45890: no more pending results, returning what we have 30529 1726882695.45895: results queue empty 30529 1726882695.45896: checking for any_errors_fatal 30529 1726882695.45902: done checking for any_errors_fatal 30529 1726882695.45903: checking for max_fail_percentage 30529 
1726882695.45905: done checking for max_fail_percentage 30529 1726882695.45906: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.45906: done checking to see if all hosts have failed 30529 1726882695.45907: getting the remaining hosts for this loop 30529 1726882695.45909: done getting the remaining hosts for this loop 30529 1726882695.45912: getting the next task for host managed_node1 30529 1726882695.45921: done getting next task for host managed_node1 30529 1726882695.45924: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882695.45929: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882695.45949: getting variables 30529 1726882695.45951: in VariableManager get_vars() 30529 1726882695.45994: Calling all_inventory to load vars for managed_node1 30529 1726882695.45997: Calling groups_inventory to load vars for managed_node1 30529 1726882695.45999: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.46008: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.46010: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.46013: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.46565: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002331 30529 1726882695.46568: WORKER PROCESS EXITING 30529 1726882695.46791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.47796: done with get_vars() 30529 1726882695.47811: done getting variables 30529 1726882695.47853: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:15 -0400 (0:00:00.032) 0:01:49.504 ****** 30529 1726882695.47876: entering _queue_task() for managed_node1/package 30529 1726882695.48119: worker is 1 (out of 1 available) 30529 1726882695.48134: exiting _queue_task() for managed_node1/package 30529 1726882695.48147: done queuing things up, now waiting for results queue to drain 30529 1726882695.48148: waiting for pending results... 
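[Editor's note] The package-install tasks in this stretch of the log are skipped because `network_state != {}` evaluates to False (the role default for `network_state` is an empty dict), while the distribution check `ansible_distribution_major_version != '6'` passes. A hypothetical sketch of how such a guarded task is commonly written — the exact body in the role's tasks/main.yml may differ, and the package list here is an assumption:

```yaml
# Illustrative reconstruction only; package names and layout are assumptions,
# not a verbatim copy of the fedora.linux_system_roles.network source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - network_state != {}                         # evaluated False -> task skipped
```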
30529 1726882695.48342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882695.48446: in run() - task 12673a56-9f93-b0f1-edc0-000000002332 30529 1726882695.48458: variable 'ansible_search_path' from source: unknown 30529 1726882695.48461: variable 'ansible_search_path' from source: unknown 30529 1726882695.48490: calling self._execute() 30529 1726882695.48566: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.48569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.48577: variable 'omit' from source: magic vars 30529 1726882695.48867: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.48877: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.48966: variable 'network_state' from source: role '' defaults 30529 1726882695.48975: Evaluated conditional (network_state != {}): False 30529 1726882695.48978: when evaluation is False, skipping this task 30529 1726882695.48982: _execute() done 30529 1726882695.48985: dumping result to json 30529 1726882695.48987: done dumping result, returning 30529 1726882695.49033: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-000000002332] 30529 1726882695.49037: sending task result for task 12673a56-9f93-b0f1-edc0-000000002332 30529 1726882695.49102: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002332 30529 1726882695.49104: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882695.49172: no more pending results, returning what we have 30529 1726882695.49176: results queue empty 30529 1726882695.49177: checking for 
any_errors_fatal 30529 1726882695.49184: done checking for any_errors_fatal 30529 1726882695.49184: checking for max_fail_percentage 30529 1726882695.49186: done checking for max_fail_percentage 30529 1726882695.49187: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.49187: done checking to see if all hosts have failed 30529 1726882695.49188: getting the remaining hosts for this loop 30529 1726882695.49190: done getting the remaining hosts for this loop 30529 1726882695.49200: getting the next task for host managed_node1 30529 1726882695.49208: done getting next task for host managed_node1 30529 1726882695.49211: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882695.49217: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882695.49239: getting variables 30529 1726882695.49241: in VariableManager get_vars() 30529 1726882695.49276: Calling all_inventory to load vars for managed_node1 30529 1726882695.49278: Calling groups_inventory to load vars for managed_node1 30529 1726882695.49280: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.49289: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.49291: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.49296: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.50071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.50948: done with get_vars() 30529 1726882695.50963: done getting variables 30529 1726882695.51007: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:15 -0400 (0:00:00.031) 0:01:49.536 ****** 30529 1726882695.51032: entering _queue_task() for managed_node1/service 30529 1726882695.51258: worker is 1 (out of 1 available) 30529 1726882695.51272: exiting _queue_task() for managed_node1/service 30529 1726882695.51286: done queuing things up, now waiting for results queue to drain 30529 1726882695.51287: waiting for pending results... 
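[Editor's note] The "Restart NetworkManager due to wireless or team interfaces" task queued here is guarded by the wireless/team conditions that the log evaluates a few lines further on (both False, so the task is skipped). A hedged sketch of a task of that shape — the service parameters are an assumption, not the role's verbatim source:

```yaml
# Illustrative sketch only; the real task body in the role may differ.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```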
30529 1726882695.51473: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882695.51574: in run() - task 12673a56-9f93-b0f1-edc0-000000002333 30529 1726882695.51585: variable 'ansible_search_path' from source: unknown 30529 1726882695.51590: variable 'ansible_search_path' from source: unknown 30529 1726882695.51623: calling self._execute() 30529 1726882695.51697: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.51701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.51709: variable 'omit' from source: magic vars 30529 1726882695.51991: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.52004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.52089: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.52223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.53761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.53820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.53851: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.53876: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.53903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.53962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882695.53984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.54010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.54037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.54049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.54080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.54100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.54198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.54201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.54204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.54207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.54210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.54212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.54240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.54251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.54369: variable 'network_connections' from source: include params 30529 1726882695.54378: variable 'interface' from source: play vars 30529 1726882695.54430: variable 'interface' from source: play vars 30529 1726882695.54480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.54591: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.54911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.54933: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.54954: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.54984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.55004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.55023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.55040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.55077: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882695.55228: variable 'network_connections' from source: include params 30529 1726882695.55231: variable 'interface' from source: play vars 30529 1726882695.55273: variable 'interface' from source: play vars 30529 1726882695.55294: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882695.55299: when evaluation is False, skipping this task 30529 1726882695.55302: _execute() done 30529 1726882695.55304: dumping result to json 30529 1726882695.55307: done dumping result, returning 30529 1726882695.55310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000002333] 30529 1726882695.55321: sending task result for task 12673a56-9f93-b0f1-edc0-000000002333 30529 1726882695.55407: done sending task result for task 
12673a56-9f93-b0f1-edc0-000000002333 30529 1726882695.55416: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882695.55473: no more pending results, returning what we have 30529 1726882695.55477: results queue empty 30529 1726882695.55478: checking for any_errors_fatal 30529 1726882695.55483: done checking for any_errors_fatal 30529 1726882695.55484: checking for max_fail_percentage 30529 1726882695.55485: done checking for max_fail_percentage 30529 1726882695.55486: checking to see if all hosts have failed and the running result is not ok 30529 1726882695.55490: done checking to see if all hosts have failed 30529 1726882695.55490: getting the remaining hosts for this loop 30529 1726882695.55492: done getting the remaining hosts for this loop 30529 1726882695.55498: getting the next task for host managed_node1 30529 1726882695.55507: done getting next task for host managed_node1 30529 1726882695.55511: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882695.55516: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882695.55541: getting variables 30529 1726882695.55543: in VariableManager get_vars() 30529 1726882695.55592: Calling all_inventory to load vars for managed_node1 30529 1726882695.55601: Calling groups_inventory to load vars for managed_node1 30529 1726882695.55604: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882695.55613: Calling all_plugins_play to load vars for managed_node1 30529 1726882695.55616: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882695.55619: Calling groups_plugins_play to load vars for managed_node1 30529 1726882695.56636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882695.57497: done with get_vars() 30529 1726882695.57513: done getting variables 30529 1726882695.57557: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:15 -0400 (0:00:00.065) 0:01:49.601 ****** 30529 1726882695.57581: entering _queue_task() for managed_node1/service 30529 1726882695.57830: worker is 1 (out of 1 available) 30529 1726882695.57843: exiting _queue_task() for managed_node1/service 30529 1726882695.57856: done 
queuing things up, now waiting for results queue to drain 30529 1726882695.57858: waiting for pending results... 30529 1726882695.58059: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882695.58164: in run() - task 12673a56-9f93-b0f1-edc0-000000002334 30529 1726882695.58177: variable 'ansible_search_path' from source: unknown 30529 1726882695.58180: variable 'ansible_search_path' from source: unknown 30529 1726882695.58214: calling self._execute() 30529 1726882695.58286: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.58295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.58305: variable 'omit' from source: magic vars 30529 1726882695.58607: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.58616: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882695.58730: variable 'network_provider' from source: set_fact 30529 1726882695.58734: variable 'network_state' from source: role '' defaults 30529 1726882695.58748: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882695.58750: variable 'omit' from source: magic vars 30529 1726882695.58790: variable 'omit' from source: magic vars 30529 1726882695.58815: variable 'network_service_name' from source: role '' defaults 30529 1726882695.58864: variable 'network_service_name' from source: role '' defaults 30529 1726882695.58936: variable '__network_provider_setup' from source: role '' defaults 30529 1726882695.58941: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882695.58988: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882695.59000: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882695.59042: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882695.59191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882695.60679: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882695.60741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882695.60768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882695.60804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882695.60823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882695.60881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.60907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.60927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.60952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.60962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.60997: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.61013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.61033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.61057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.61067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.61218: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882695.61294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.61312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.61328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.61355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.61366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.61428: variable 'ansible_python' from source: facts 30529 1726882695.61440: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882695.61499: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882695.61550: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882695.61635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.61652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.61670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.61701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.61711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.61743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882695.61762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882695.61778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.61810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882695.61821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882695.61911: variable 'network_connections' from source: include params 30529 1726882695.61917: variable 'interface' from source: play vars 30529 1726882695.61968: variable 'interface' from source: play vars 30529 1726882695.62044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882695.62174: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882695.62212: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882695.62245: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882695.62273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882695.62318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882695.62341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882695.62363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882695.62385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882695.62426: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.62604: variable 'network_connections' from source: include params 30529 1726882695.62610: variable 'interface' from source: play vars 30529 1726882695.62665: variable 'interface' from source: play vars 30529 1726882695.62684: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882695.62740: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882695.62926: variable 'network_connections' from source: include params 30529 1726882695.62929: variable 'interface' from source: play vars 30529 1726882695.62977: variable 'interface' from source: play vars 30529 1726882695.62999: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882695.63051: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882695.63235: variable 'network_connections' from source: include params 30529 1726882695.63238: variable 'interface' from source: play vars 30529 1726882695.63288: variable 'interface' from source: play vars 30529 1726882695.63330: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882695.63371: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882695.63377: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882695.63426: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882695.63558: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882695.63871: variable 'network_connections' from source: include params 30529 1726882695.63874: variable 'interface' from source: play vars 30529 1726882695.63919: variable 'interface' from source: play vars 30529 1726882695.63926: variable 'ansible_distribution' from source: facts 30529 1726882695.63928: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.63935: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.63945: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882695.64060: variable 'ansible_distribution' from source: facts 30529 1726882695.64063: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.64067: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.64081: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882695.64192: variable 'ansible_distribution' from source: facts 30529 1726882695.64201: variable '__network_rh_distros' from source: role '' defaults 30529 1726882695.64206: variable 'ansible_distribution_major_version' from source: facts 30529 1726882695.64231: variable 'network_provider' from source: set_fact 30529 1726882695.64248: variable 'omit' from source: magic vars 30529 1726882695.64268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882695.64292: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882695.64311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882695.64324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882695.64333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882695.64355: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882695.64358: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.64360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882695.64436: Set connection var ansible_shell_executable to /bin/sh 30529 1726882695.64439: Set connection var ansible_pipelining to False 30529 1726882695.64442: Set connection var ansible_shell_type to sh 30529 1726882695.64450: Set connection var ansible_timeout to 10 30529 1726882695.64452: Set connection var ansible_connection to ssh 30529 1726882695.64456: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882695.64475: variable 'ansible_shell_executable' from source: unknown 30529 1726882695.64478: variable 'ansible_connection' from source: unknown 30529 1726882695.64480: variable 'ansible_module_compression' from source: unknown 30529 1726882695.64482: variable 'ansible_shell_type' from source: unknown 30529 1726882695.64484: variable 'ansible_shell_executable' from source: unknown 30529 1726882695.64491: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882695.64499: variable 'ansible_pipelining' from source: unknown 30529 1726882695.64502: variable 'ansible_timeout' from source: unknown 30529 1726882695.64504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882695.64574: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882695.64582: variable 'omit' from source: magic vars 30529 1726882695.64591: starting attempt loop 30529 1726882695.64595: running the handler 30529 1726882695.64647: variable 'ansible_facts' from source: unknown 30529 1726882695.65127: _low_level_execute_command(): starting 30529 1726882695.65133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882695.65629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.65633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.65636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882695.65639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.65690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882695.65696: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882695.65757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882695.67440: stdout chunk (state=3): >>>/root <<< 30529 1726882695.67541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882695.67568: stderr chunk (state=3): >>><<< 30529 1726882695.67572: stdout chunk (state=3): >>><<< 30529 1726882695.67598: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882695.67608: _low_level_execute_command(): starting 30529 1726882695.67614: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679 `" && echo ansible-tmp-1726882695.6759698-35597-236771710209679="` echo /root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679 `" ) && sleep 0' 30529 1726882695.68050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.68053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.68055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882695.68057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.68060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.68099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882695.68113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882695.68157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882695.70015: stdout chunk (state=3): >>>ansible-tmp-1726882695.6759698-35597-236771710209679=/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679 <<< 30529 1726882695.70123: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882695.70149: stderr chunk (state=3): >>><<< 30529 1726882695.70152: stdout chunk (state=3): >>><<< 30529 1726882695.70166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882695.6759698-35597-236771710209679=/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882695.70197: variable 'ansible_module_compression' from source: unknown 30529 1726882695.70236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882695.70285: variable 'ansible_facts' from source: unknown 30529 1726882695.70425: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py 30529 1726882695.70521: Sending initial data 30529 1726882695.70525: Sent initial data (156 bytes) 30529 1726882695.70973: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.70976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882695.70983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.70985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.70987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.71035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882695.71038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882695.71085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882695.72592: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882695.72602: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882695.72633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882695.72674: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmph29_2q1r /root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py <<< 30529 1726882695.72680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py" <<< 30529 1726882695.72716: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmph29_2q1r" to remote "/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py" <<< 30529 1726882695.73760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882695.73798: stderr chunk (state=3): >>><<< 30529 1726882695.73801: stdout chunk (state=3): >>><<< 30529 1726882695.73839: done transferring module to remote 30529 1726882695.73847: _low_level_execute_command(): starting 30529 1726882695.73851: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/ /root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py && sleep 0' 30529 1726882695.74278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882695.74281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.74284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882695.74286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882695.74290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.74336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882695.74343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882695.74345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882695.74385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882695.76071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882695.76100: stderr chunk (state=3): >>><<< 30529 1726882695.76103: stdout chunk (state=3): 
>>><<< 30529 1726882695.76113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882695.76116: _low_level_execute_command(): starting 30529 1726882695.76121: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/AnsiballZ_systemd.py && sleep 0' 30529 1726882695.76522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882695.76527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.76548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882695.76598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882695.76601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882695.76608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882695.76654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.05332: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10817536", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3294957568", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1954173000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": 
"[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882696.05345: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882696.05354: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", 
"StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882696.07115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882696.07144: stderr chunk (state=3): >>><<< 30529 1726882696.07147: stdout chunk (state=3): >>><<< 30529 1726882696.07164: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10817536", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3294957568", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1954173000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882696.07286: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882696.07309: _low_level_execute_command(): starting 30529 1726882696.07312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882695.6759698-35597-236771710209679/ > /dev/null 2>&1 && sleep 0' 30529 1726882696.07760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.07763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.07765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.07767: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.07821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.07824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882696.07832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.07871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.09647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.09671: stderr chunk (state=3): >>><<< 30529 1726882696.09674: stdout chunk (state=3): >>><<< 30529 1726882696.09690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.09695: handler run complete 30529 1726882696.09732: attempt loop complete, returning result 30529 1726882696.09735: _execute() done 30529 1726882696.09737: dumping result to json 30529 1726882696.09749: done dumping result, returning 30529 1726882696.09757: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-000000002334] 30529 1726882696.09766: sending task result for task 12673a56-9f93-b0f1-edc0-000000002334 30529 1726882696.10022: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002334 30529 1726882696.10025: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882696.10077: no more pending results, returning what we have 30529 1726882696.10079: results queue empty 30529 1726882696.10080: checking for any_errors_fatal 30529 1726882696.10086: done checking for any_errors_fatal 30529 1726882696.10089: checking for max_fail_percentage 30529 1726882696.10091: done checking for max_fail_percentage 30529 1726882696.10092: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.10094: done checking to see if all hosts have failed 30529 1726882696.10095: getting the remaining hosts for this loop 30529 1726882696.10097: done getting the remaining hosts for this loop 30529 1726882696.10101: getting the next task for host managed_node1 30529 1726882696.10108: done getting next task for host managed_node1 30529 1726882696.10111: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882696.10116: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.10127: getting variables 30529 1726882696.10129: in VariableManager get_vars() 30529 1726882696.10166: Calling all_inventory to load vars for managed_node1 30529 1726882696.10169: Calling groups_inventory to load vars for managed_node1 30529 1726882696.10171: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.10180: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.10183: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.10185: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.11051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.12334: done with get_vars() 30529 1726882696.12356: done getting variables 30529 1726882696.12404: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:16 -0400 (0:00:00.548) 0:01:50.150 ****** 30529 1726882696.12432: entering _queue_task() for managed_node1/service 30529 1726882696.12677: worker is 1 (out of 1 available) 30529 1726882696.12695: exiting _queue_task() for managed_node1/service 30529 1726882696.12708: done queuing things up, now waiting for results queue to drain 30529 1726882696.12709: waiting for pending results... 
30529 1726882696.12896: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882696.12995: in run() - task 12673a56-9f93-b0f1-edc0-000000002335 30529 1726882696.13003: variable 'ansible_search_path' from source: unknown 30529 1726882696.13007: variable 'ansible_search_path' from source: unknown 30529 1726882696.13035: calling self._execute() 30529 1726882696.13111: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.13115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.13123: variable 'omit' from source: magic vars 30529 1726882696.13409: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.13418: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.13499: variable 'network_provider' from source: set_fact 30529 1726882696.13502: Evaluated conditional (network_provider == "nm"): True 30529 1726882696.13566: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882696.13632: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882696.13749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882696.15999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882696.16003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882696.16043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882696.16119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882696.16122: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882696.16206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882696.16246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882696.16278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882696.16327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882696.16401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882696.16404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882696.16434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882696.16519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882696.16523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882696.16525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882696.16569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882696.16600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882696.16634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882696.16676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882696.16699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882696.16855: variable 'network_connections' from source: include params 30529 1726882696.16872: variable 'interface' from source: play vars 30529 1726882696.16944: variable 'interface' from source: play vars 30529 1726882696.17064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882696.17204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882696.17246: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882696.17287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882696.17323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882696.17390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882696.17398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882696.17429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882696.17461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882696.17525: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882696.17686: variable 'network_connections' from source: include params 30529 1726882696.17692: variable 'interface' from source: play vars 30529 1726882696.17735: variable 'interface' from source: play vars 30529 1726882696.17757: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882696.17763: when evaluation is False, skipping this task 30529 1726882696.17765: _execute() done 30529 1726882696.17768: dumping result to json 30529 1726882696.17771: done dumping result, returning 30529 1726882696.17779: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-000000002335] 30529 
1726882696.17789: sending task result for task 12673a56-9f93-b0f1-edc0-000000002335 30529 1726882696.17874: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002335 30529 1726882696.17877: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882696.17924: no more pending results, returning what we have 30529 1726882696.17927: results queue empty 30529 1726882696.17928: checking for any_errors_fatal 30529 1726882696.17949: done checking for any_errors_fatal 30529 1726882696.17949: checking for max_fail_percentage 30529 1726882696.17951: done checking for max_fail_percentage 30529 1726882696.17952: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.17953: done checking to see if all hosts have failed 30529 1726882696.17954: getting the remaining hosts for this loop 30529 1726882696.17955: done getting the remaining hosts for this loop 30529 1726882696.17959: getting the next task for host managed_node1 30529 1726882696.17968: done getting next task for host managed_node1 30529 1726882696.17972: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882696.17977: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882696.18001: getting variables 30529 1726882696.18003: in VariableManager get_vars() 30529 1726882696.18050: Calling all_inventory to load vars for managed_node1 30529 1726882696.18052: Calling groups_inventory to load vars for managed_node1 30529 1726882696.18054: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.18064: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.18066: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.18068: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.19009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.20297: done with get_vars() 30529 1726882696.20313: done getting variables 30529 1726882696.20354: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:16 -0400 (0:00:00.079) 0:01:50.229 
****** 30529 1726882696.20378: entering _queue_task() for managed_node1/service 30529 1726882696.20611: worker is 1 (out of 1 available) 30529 1726882696.20624: exiting _queue_task() for managed_node1/service 30529 1726882696.20638: done queuing things up, now waiting for results queue to drain 30529 1726882696.20639: waiting for pending results... 30529 1726882696.20823: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882696.20914: in run() - task 12673a56-9f93-b0f1-edc0-000000002336 30529 1726882696.20926: variable 'ansible_search_path' from source: unknown 30529 1726882696.20929: variable 'ansible_search_path' from source: unknown 30529 1726882696.20956: calling self._execute() 30529 1726882696.21035: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.21039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.21047: variable 'omit' from source: magic vars 30529 1726882696.21324: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.21333: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.21413: variable 'network_provider' from source: set_fact 30529 1726882696.21417: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882696.21419: when evaluation is False, skipping this task 30529 1726882696.21422: _execute() done 30529 1726882696.21424: dumping result to json 30529 1726882696.21427: done dumping result, returning 30529 1726882696.21435: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-000000002336] 30529 1726882696.21439: sending task result for task 12673a56-9f93-b0f1-edc0-000000002336 30529 1726882696.21524: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002336 30529 1726882696.21527: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882696.21573: no more pending results, returning what we have 30529 1726882696.21576: results queue empty 30529 1726882696.21577: checking for any_errors_fatal 30529 1726882696.21584: done checking for any_errors_fatal 30529 1726882696.21585: checking for max_fail_percentage 30529 1726882696.21586: done checking for max_fail_percentage 30529 1726882696.21588: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.21588: done checking to see if all hosts have failed 30529 1726882696.21589: getting the remaining hosts for this loop 30529 1726882696.21591: done getting the remaining hosts for this loop 30529 1726882696.21596: getting the next task for host managed_node1 30529 1726882696.21603: done getting next task for host managed_node1 30529 1726882696.21606: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882696.21611: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882696.21631: getting variables 30529 1726882696.21633: in VariableManager get_vars() 30529 1726882696.21672: Calling all_inventory to load vars for managed_node1 30529 1726882696.21674: Calling groups_inventory to load vars for managed_node1 30529 1726882696.21676: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.21684: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.21687: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.21689: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.22777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.23965: done with get_vars() 30529 1726882696.23981: done getting variables 30529 1726882696.24024: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:16 -0400 (0:00:00.036) 0:01:50.266 ****** 30529 1726882696.24048: entering _queue_task() for managed_node1/copy 30529 1726882696.24255: worker is 1 (out of 1 available) 30529 1726882696.24268: exiting _queue_task() for managed_node1/copy 30529 1726882696.24281: done queuing things up, now waiting for results queue to drain 30529 1726882696.24282: waiting for 
pending results... 30529 1726882696.24469: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882696.24567: in run() - task 12673a56-9f93-b0f1-edc0-000000002337 30529 1726882696.24577: variable 'ansible_search_path' from source: unknown 30529 1726882696.24580: variable 'ansible_search_path' from source: unknown 30529 1726882696.24614: calling self._execute() 30529 1726882696.24685: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.24692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.24701: variable 'omit' from source: magic vars 30529 1726882696.25199: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.25202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.25204: variable 'network_provider' from source: set_fact 30529 1726882696.25207: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882696.25212: when evaluation is False, skipping this task 30529 1726882696.25214: _execute() done 30529 1726882696.25216: dumping result to json 30529 1726882696.25218: done dumping result, returning 30529 1726882696.25222: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-000000002337] 30529 1726882696.25224: sending task result for task 12673a56-9f93-b0f1-edc0-000000002337 30529 1726882696.25310: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002337 30529 1726882696.25316: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882696.25382: no more pending results, returning what we have 30529 1726882696.25386: results queue empty 30529 
1726882696.25387: checking for any_errors_fatal 30529 1726882696.25396: done checking for any_errors_fatal 30529 1726882696.25397: checking for max_fail_percentage 30529 1726882696.25399: done checking for max_fail_percentage 30529 1726882696.25400: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.25401: done checking to see if all hosts have failed 30529 1726882696.25402: getting the remaining hosts for this loop 30529 1726882696.25403: done getting the remaining hosts for this loop 30529 1726882696.25407: getting the next task for host managed_node1 30529 1726882696.25417: done getting next task for host managed_node1 30529 1726882696.25421: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882696.25426: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.25450: getting variables 30529 1726882696.25453: in VariableManager get_vars() 30529 1726882696.25551: Calling all_inventory to load vars for managed_node1 30529 1726882696.25554: Calling groups_inventory to load vars for managed_node1 30529 1726882696.25557: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.25570: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.25573: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.25576: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.26785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.27657: done with get_vars() 30529 1726882696.27672: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:16 -0400 (0:00:00.036) 0:01:50.303 ****** 30529 1726882696.27734: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882696.27940: worker is 1 (out of 1 available) 30529 1726882696.27954: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882696.27967: done queuing things up, now waiting for results queue to drain 30529 1726882696.27968: waiting for pending results... 
30529 1726882696.28159: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882696.28256: in run() - task 12673a56-9f93-b0f1-edc0-000000002338 30529 1726882696.28269: variable 'ansible_search_path' from source: unknown 30529 1726882696.28272: variable 'ansible_search_path' from source: unknown 30529 1726882696.28303: calling self._execute() 30529 1726882696.28374: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.28378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.28386: variable 'omit' from source: magic vars 30529 1726882696.28898: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.28902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.28904: variable 'omit' from source: magic vars 30529 1726882696.28907: variable 'omit' from source: magic vars 30529 1726882696.28975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882696.30516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882696.30561: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882696.30587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882696.30616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882696.30636: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882696.30696: variable 'network_provider' from source: set_fact 30529 1726882696.30779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882696.30802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882696.30820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882696.30845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882696.30856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882696.30913: variable 'omit' from source: magic vars 30529 1726882696.30981: variable 'omit' from source: magic vars 30529 1726882696.31053: variable 'network_connections' from source: include params 30529 1726882696.31063: variable 'interface' from source: play vars 30529 1726882696.31113: variable 'interface' from source: play vars 30529 1726882696.31218: variable 'omit' from source: magic vars 30529 1726882696.31223: variable '__lsr_ansible_managed' from source: task vars 30529 1726882696.31264: variable '__lsr_ansible_managed' from source: task vars 30529 1726882696.31670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882696.31805: Loaded config def from plugin (lookup/template) 30529 1726882696.31809: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882696.31828: File lookup term: get_ansible_managed.j2 30529 1726882696.31831: variable 
'ansible_search_path' from source: unknown 30529 1726882696.31834: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882696.31845: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882696.31860: variable 'ansible_search_path' from source: unknown 30529 1726882696.34985: variable 'ansible_managed' from source: unknown 30529 1726882696.35060: variable 'omit' from source: magic vars 30529 1726882696.35079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882696.35099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882696.35114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882696.35129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882696.35137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.35156: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882696.35159: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.35162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.35226: Set connection var ansible_shell_executable to /bin/sh 30529 1726882696.35229: Set connection var ansible_pipelining to False 30529 1726882696.35232: Set connection var ansible_shell_type to sh 30529 1726882696.35241: Set connection var ansible_timeout to 10 30529 1726882696.35244: Set connection var ansible_connection to ssh 30529 1726882696.35246: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882696.35262: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.35265: variable 'ansible_connection' from source: unknown 30529 1726882696.35268: variable 'ansible_module_compression' from source: unknown 30529 1726882696.35270: variable 'ansible_shell_type' from source: unknown 30529 1726882696.35272: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.35275: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.35281: variable 'ansible_pipelining' from source: unknown 30529 1726882696.35283: variable 'ansible_timeout' from source: unknown 30529 1726882696.35286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.35371: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882696.35382: variable 'omit' from 
source: magic vars 30529 1726882696.35385: starting attempt loop 30529 1726882696.35390: running the handler 30529 1726882696.35401: _low_level_execute_command(): starting 30529 1726882696.35406: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882696.35902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.35906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.35922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.35964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.35967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882696.35969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.36023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.37610: stdout chunk (state=3): >>>/root <<< 30529 1726882696.37710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.37741: stderr chunk (state=3): >>><<< 30529 
1726882696.37744: stdout chunk (state=3): >>><<< 30529 1726882696.37758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.37768: _low_level_execute_command(): starting 30529 1726882696.37773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538 `" && echo ansible-tmp-1726882696.3775835-35618-279524747930538="` echo /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538 `" ) && sleep 0' 30529 1726882696.38187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.38190: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.38195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.38197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.38246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.38250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.38301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.40149: stdout chunk (state=3): >>>ansible-tmp-1726882696.3775835-35618-279524747930538=/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538 <<< 30529 1726882696.40258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.40277: stderr chunk (state=3): >>><<< 30529 1726882696.40280: stdout chunk (state=3): >>><<< 30529 1726882696.40297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882696.3775835-35618-279524747930538=/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.40329: variable 'ansible_module_compression' from source: unknown 30529 1726882696.40363: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882696.40400: variable 'ansible_facts' from source: unknown 30529 1726882696.40482: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py 30529 1726882696.40571: Sending initial data 30529 1726882696.40575: Sent initial data (168 bytes) 30529 1726882696.41009: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.41014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882696.41022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.41026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882696.41028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.41066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.41070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.41116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.42628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882696.42632: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882696.42666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882696.42710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp4_hbkkh9 /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py <<< 30529 1726882696.42713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py" <<< 30529 1726882696.42754: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp4_hbkkh9" to remote "/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py" <<< 30529 1726882696.43440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.43477: stderr chunk (state=3): >>><<< 30529 1726882696.43480: stdout chunk (state=3): >>><<< 30529 1726882696.43514: done transferring module to remote 30529 1726882696.43522: _low_level_execute_command(): starting 30529 1726882696.43526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/ /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py && sleep 0' 30529 1726882696.43948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.43952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 
1726882696.43954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.43956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882696.43958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.43960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.44010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.44017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.44057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.45753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.45774: stderr chunk (state=3): >>><<< 30529 1726882696.45777: stdout chunk (state=3): >>><<< 30529 1726882696.45792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.45803: _low_level_execute_command(): starting 30529 1726882696.45806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/AnsiballZ_network_connections.py && sleep 0' 30529 1726882696.46211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.46214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.46217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882696.46219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882696.46221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882696.46222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.46268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.46271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.46322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.70911: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882696.72588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882696.72618: stderr chunk (state=3): >>><<< 30529 1726882696.72621: stdout chunk (state=3): >>><<< 30529 1726882696.72638: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882696.72670: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882696.72677: _low_level_execute_command(): starting 30529 1726882696.72681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882696.3775835-35618-279524747930538/ > /dev/null 2>&1 && sleep 0' 30529 1726882696.73132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882696.73136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882696.73138: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.73140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882696.73142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882696.73144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.73197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.73200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882696.73208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.73246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.75024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.75048: stderr chunk (state=3): >>><<< 30529 1726882696.75051: stdout chunk (state=3): >>><<< 30529 1726882696.75062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.75068: handler run complete 30529 1726882696.75086: attempt loop complete, returning result 30529 1726882696.75088: _execute() done 30529 1726882696.75095: dumping result to json 30529 1726882696.75100: done dumping result, returning 30529 1726882696.75109: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-000000002338] 30529 1726882696.75118: sending task result for task 12673a56-9f93-b0f1-edc0-000000002338 30529 1726882696.75217: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002338 30529 1726882696.75220: WORKER PROCESS EXITING ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active 30529 1726882696.75336: no more pending results, returning what we have 30529 1726882696.75339: results queue empty 30529 1726882696.75340: checking for any_errors_fatal 30529 1726882696.75346: done checking for any_errors_fatal 30529 1726882696.75347: checking for max_fail_percentage 30529 1726882696.75349: done checking for max_fail_percentage 30529 1726882696.75349: checking to see if all hosts have failed and the running result is not ok 30529 
1726882696.75352: done checking to see if all hosts have failed 30529 1726882696.75353: getting the remaining hosts for this loop 30529 1726882696.75354: done getting the remaining hosts for this loop 30529 1726882696.75358: getting the next task for host managed_node1 30529 1726882696.75365: done getting next task for host managed_node1 30529 1726882696.75368: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882696.75374: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.75385: getting variables 30529 1726882696.75387: in VariableManager get_vars() 30529 1726882696.75429: Calling all_inventory to load vars for managed_node1 30529 1726882696.75432: Calling groups_inventory to load vars for managed_node1 30529 1726882696.75434: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.75443: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.75445: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.75448: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.76440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.77277: done with get_vars() 30529 1726882696.77297: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:16 -0400 (0:00:00.496) 0:01:50.799 ****** 30529 1726882696.77358: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882696.77588: worker is 1 (out of 1 available) 30529 1726882696.77602: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882696.77616: done queuing things up, now waiting for results queue to drain 30529 1726882696.77617: waiting for pending results... 
30529 1726882696.77804: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882696.77914: in run() - task 12673a56-9f93-b0f1-edc0-000000002339 30529 1726882696.77927: variable 'ansible_search_path' from source: unknown 30529 1726882696.77931: variable 'ansible_search_path' from source: unknown 30529 1726882696.77959: calling self._execute() 30529 1726882696.78199: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.78203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.78206: variable 'omit' from source: magic vars 30529 1726882696.78435: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.78452: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.78582: variable 'network_state' from source: role '' defaults 30529 1726882696.78605: Evaluated conditional (network_state != {}): False 30529 1726882696.78615: when evaluation is False, skipping this task 30529 1726882696.78623: _execute() done 30529 1726882696.78631: dumping result to json 30529 1726882696.78639: done dumping result, returning 30529 1726882696.78651: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-000000002339] 30529 1726882696.78662: sending task result for task 12673a56-9f93-b0f1-edc0-000000002339 30529 1726882696.78772: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002339 30529 1726882696.78781: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882696.78844: no more pending results, returning what we have 30529 1726882696.78848: results queue empty 30529 1726882696.78849: checking for any_errors_fatal 30529 1726882696.78862: done checking for any_errors_fatal 
30529 1726882696.78863: checking for max_fail_percentage 30529 1726882696.78865: done checking for max_fail_percentage 30529 1726882696.78865: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.78866: done checking to see if all hosts have failed 30529 1726882696.78867: getting the remaining hosts for this loop 30529 1726882696.78869: done getting the remaining hosts for this loop 30529 1726882696.78873: getting the next task for host managed_node1 30529 1726882696.78881: done getting next task for host managed_node1 30529 1726882696.78884: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882696.78890: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.78918: getting variables 30529 1726882696.78920: in VariableManager get_vars() 30529 1726882696.78962: Calling all_inventory to load vars for managed_node1 30529 1726882696.78964: Calling groups_inventory to load vars for managed_node1 30529 1726882696.78966: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.78976: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.78979: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.78981: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.80131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.80985: done with get_vars() 30529 1726882696.81005: done getting variables 30529 1726882696.81048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:16 -0400 (0:00:00.037) 0:01:50.836 ****** 30529 1726882696.81073: entering _queue_task() for managed_node1/debug 30529 1726882696.81309: worker is 1 (out of 1 available) 30529 1726882696.81322: exiting _queue_task() for managed_node1/debug 30529 1726882696.81335: done queuing things up, now waiting for results queue to drain 30529 1726882696.81337: waiting for pending results... 
30529 1726882696.81520: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882696.81618: in run() - task 12673a56-9f93-b0f1-edc0-00000000233a 30529 1726882696.81633: variable 'ansible_search_path' from source: unknown 30529 1726882696.81637: variable 'ansible_search_path' from source: unknown 30529 1726882696.81663: calling self._execute() 30529 1726882696.81743: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.81746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.81755: variable 'omit' from source: magic vars 30529 1726882696.82027: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.82037: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.82043: variable 'omit' from source: magic vars 30529 1726882696.82087: variable 'omit' from source: magic vars 30529 1726882696.82117: variable 'omit' from source: magic vars 30529 1726882696.82146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882696.82173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882696.82192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882696.82206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.82216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.82241: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882696.82244: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.82246: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882696.82318: Set connection var ansible_shell_executable to /bin/sh 30529 1726882696.82322: Set connection var ansible_pipelining to False 30529 1726882696.82324: Set connection var ansible_shell_type to sh 30529 1726882696.82334: Set connection var ansible_timeout to 10 30529 1726882696.82336: Set connection var ansible_connection to ssh 30529 1726882696.82339: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882696.82356: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.82359: variable 'ansible_connection' from source: unknown 30529 1726882696.82362: variable 'ansible_module_compression' from source: unknown 30529 1726882696.82364: variable 'ansible_shell_type' from source: unknown 30529 1726882696.82366: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.82368: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.82371: variable 'ansible_pipelining' from source: unknown 30529 1726882696.82374: variable 'ansible_timeout' from source: unknown 30529 1726882696.82378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.82478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882696.82490: variable 'omit' from source: magic vars 30529 1726882696.82495: starting attempt loop 30529 1726882696.82498: running the handler 30529 1726882696.82589: variable '__network_connections_result' from source: set_fact 30529 1726882696.82628: handler run complete 30529 1726882696.82640: attempt loop complete, returning result 30529 1726882696.82643: _execute() done 30529 1726882696.82646: dumping result to json 30529 1726882696.82648: 
done dumping result, returning 30529 1726882696.82660: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000233a] 30529 1726882696.82663: sending task result for task 12673a56-9f93-b0f1-edc0-00000000233a 30529 1726882696.82743: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000233a 30529 1726882696.82746: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active" ] } 30529 1726882696.82827: no more pending results, returning what we have 30529 1726882696.82830: results queue empty 30529 1726882696.82831: checking for any_errors_fatal 30529 1726882696.82835: done checking for any_errors_fatal 30529 1726882696.82835: checking for max_fail_percentage 30529 1726882696.82837: done checking for max_fail_percentage 30529 1726882696.82838: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.82838: done checking to see if all hosts have failed 30529 1726882696.82839: getting the remaining hosts for this loop 30529 1726882696.82840: done getting the remaining hosts for this loop 30529 1726882696.82844: getting the next task for host managed_node1 30529 1726882696.82851: done getting next task for host managed_node1 30529 1726882696.82854: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882696.82859: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882696.82869: getting variables 30529 1726882696.82871: in VariableManager get_vars() 30529 1726882696.82910: Calling all_inventory to load vars for managed_node1 30529 1726882696.82912: Calling groups_inventory to load vars for managed_node1 30529 1726882696.82914: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.82922: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.82925: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.82927: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.83806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.84650: done with get_vars() 30529 1726882696.84666: done getting variables 30529 1726882696.84711: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:16 -0400 (0:00:00.036) 0:01:50.873 ****** 30529 1726882696.84740: entering _queue_task() for managed_node1/debug 30529 1726882696.84957: worker is 1 (out of 1 available) 30529 1726882696.84970: exiting _queue_task() for managed_node1/debug 30529 1726882696.84984: done queuing things up, now waiting for results queue to drain 30529 1726882696.84985: waiting for pending results... 30529 1726882696.85161: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882696.85257: in run() - task 12673a56-9f93-b0f1-edc0-00000000233b 30529 1726882696.85270: variable 'ansible_search_path' from source: unknown 30529 1726882696.85274: variable 'ansible_search_path' from source: unknown 30529 1726882696.85302: calling self._execute() 30529 1726882696.85372: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.85375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.85385: variable 'omit' from source: magic vars 30529 1726882696.85652: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.85662: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.85669: variable 'omit' from source: magic vars 30529 1726882696.85718: variable 'omit' from source: magic vars 30529 1726882696.85741: variable 'omit' from source: magic vars 30529 1726882696.85772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882696.85800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882696.85816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882696.85829: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.85840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.85866: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882696.85869: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.85871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.85942: Set connection var ansible_shell_executable to /bin/sh 30529 1726882696.85945: Set connection var ansible_pipelining to False 30529 1726882696.85948: Set connection var ansible_shell_type to sh 30529 1726882696.85956: Set connection var ansible_timeout to 10 30529 1726882696.85959: Set connection var ansible_connection to ssh 30529 1726882696.85963: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882696.85982: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.85985: variable 'ansible_connection' from source: unknown 30529 1726882696.85990: variable 'ansible_module_compression' from source: unknown 30529 1726882696.85995: variable 'ansible_shell_type' from source: unknown 30529 1726882696.85997: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.85999: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.86002: variable 'ansible_pipelining' from source: unknown 30529 1726882696.86004: variable 'ansible_timeout' from source: unknown 30529 1726882696.86006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.86104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882696.86113: variable 'omit' from source: magic vars 30529 1726882696.86118: starting attempt loop 30529 1726882696.86121: running the handler 30529 1726882696.86158: variable '__network_connections_result' from source: set_fact 30529 1726882696.86215: variable '__network_connections_result' from source: set_fact 30529 1726882696.86286: handler run complete 30529 1726882696.86307: attempt loop complete, returning result 30529 1726882696.86311: _execute() done 30529 1726882696.86313: dumping result to json 30529 1726882696.86316: done dumping result, returning 30529 1726882696.86323: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-00000000233b] 30529 1726882696.86326: sending task result for task 12673a56-9f93-b0f1-edc0-00000000233b 30529 1726882696.86417: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000233b 30529 1726882696.86420: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 00b37fa6-807a-4f96-b822-2aecde64bf67 skipped because already active" ] } } 30529 1726882696.86503: no more pending results, returning what we have 30529 1726882696.86506: results queue empty 30529 
1726882696.86507: checking for any_errors_fatal 30529 1726882696.86512: done checking for any_errors_fatal 30529 1726882696.86513: checking for max_fail_percentage 30529 1726882696.86514: done checking for max_fail_percentage 30529 1726882696.86515: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.86516: done checking to see if all hosts have failed 30529 1726882696.86516: getting the remaining hosts for this loop 30529 1726882696.86518: done getting the remaining hosts for this loop 30529 1726882696.86521: getting the next task for host managed_node1 30529 1726882696.86528: done getting next task for host managed_node1 30529 1726882696.86533: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882696.86537: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.86548: getting variables 30529 1726882696.86550: in VariableManager get_vars() 30529 1726882696.86582: Calling all_inventory to load vars for managed_node1 30529 1726882696.86585: Calling groups_inventory to load vars for managed_node1 30529 1726882696.86599: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.86607: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.86610: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.86612: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.87354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.88303: done with get_vars() 30529 1726882696.88318: done getting variables 30529 1726882696.88357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:16 -0400 (0:00:00.036) 0:01:50.909 ****** 30529 1726882696.88380: entering _queue_task() for managed_node1/debug 30529 1726882696.88594: worker is 1 (out of 1 available) 30529 1726882696.88609: exiting _queue_task() for managed_node1/debug 30529 1726882696.88622: done queuing things up, now waiting for results queue to drain 30529 1726882696.88624: waiting for pending results... 
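Throughout the trace, each `variable '...' from source: ...` line records where the VariableManager resolved a value: role defaults, host vars, `set_fact`, or magic vars. A minimal, hypothetical sketch of that precedence lookup (not Ansible's real VariableManager; the data below is illustrative):

```python
# Simplified sketch of precedence-ordered variable resolution, mirroring the
# "variable '...' from source: ..." lines in the trace. Hypothetical only.
PRECEDENCE = ["role defaults", "host vars", "set_fact", "magic vars"]

def resolve(name, sources):
    """Return (value, source) from the highest-precedence layer defining name."""
    found = None
    for source in PRECEDENCE:              # iterate lowest to highest precedence
        layer = sources.get(source, {})
        if name in layer:
            found = (layer[name], source)  # higher layers override earlier hits
    return found

sources = {
    "role defaults": {"network_state": {}},
    "host vars": {"ansible_host": "10.31.9.159"},
    "set_fact": {"__network_connections_result": {"changed": False}},
}

print(resolve("network_state", sources))   # ({}, 'role defaults')
print(resolve("ansible_host", sources))    # ('10.31.9.159', 'host vars')
```

The real precedence chain has many more layers (play vars, block vars, task vars, extra vars, and so on), but the override-by-layer idea is the same.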
30529 1726882696.88796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882696.88869: in run() - task 12673a56-9f93-b0f1-edc0-00000000233c 30529 1726882696.88888: variable 'ansible_search_path' from source: unknown 30529 1726882696.88892: variable 'ansible_search_path' from source: unknown 30529 1726882696.88924: calling self._execute() 30529 1726882696.89005: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.89009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.89018: variable 'omit' from source: magic vars 30529 1726882696.89288: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.89301: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.89381: variable 'network_state' from source: role '' defaults 30529 1726882696.89390: Evaluated conditional (network_state != {}): False 30529 1726882696.89398: when evaluation is False, skipping this task 30529 1726882696.89403: _execute() done 30529 1726882696.89406: dumping result to json 30529 1726882696.89408: done dumping result, returning 30529 1726882696.89415: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-00000000233c] 30529 1726882696.89417: sending task result for task 12673a56-9f93-b0f1-edc0-00000000233c 30529 1726882696.89498: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000233c 30529 1726882696.89502: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882696.89548: no more pending results, returning what we have 30529 1726882696.89552: results queue empty 30529 1726882696.89553: checking for any_errors_fatal 30529 1726882696.89562: done checking for any_errors_fatal 30529 1726882696.89563: checking for 
max_fail_percentage 30529 1726882696.89565: done checking for max_fail_percentage 30529 1726882696.89566: checking to see if all hosts have failed and the running result is not ok 30529 1726882696.89567: done checking to see if all hosts have failed 30529 1726882696.89567: getting the remaining hosts for this loop 30529 1726882696.89569: done getting the remaining hosts for this loop 30529 1726882696.89572: getting the next task for host managed_node1 30529 1726882696.89580: done getting next task for host managed_node1 30529 1726882696.89583: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882696.89588: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882696.89609: getting variables 30529 1726882696.89611: in VariableManager get_vars() 30529 1726882696.89648: Calling all_inventory to load vars for managed_node1 30529 1726882696.89650: Calling groups_inventory to load vars for managed_node1 30529 1726882696.89652: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882696.89660: Calling all_plugins_play to load vars for managed_node1 30529 1726882696.89663: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882696.89665: Calling groups_plugins_play to load vars for managed_node1 30529 1726882696.90399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882696.91237: done with get_vars() 30529 1726882696.91253: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:16 -0400 (0:00:00.029) 0:01:50.939 ****** 30529 1726882696.91322: entering _queue_task() for managed_node1/ping 30529 1726882696.91544: worker is 1 (out of 1 available) 30529 1726882696.91558: exiting _queue_task() for managed_node1/ping 30529 1726882696.91570: done queuing things up, now waiting for results queue to drain 30529 1726882696.91572: waiting for pending results... 
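The `Evaluated conditional (network_state != {}): False ... skipping this task` lines above show a `when:` gate short-circuiting the task before any module is transferred or executed. A rough, hypothetical sketch of that skip path (not Ansible's actual TaskExecutor; conditionals here are evaluated as plain Python for illustration, whereas Ansible uses Jinja2):

```python
# Hypothetical sketch of a `when:` gate: if the conditional evaluates False,
# the task returns a skipped result and the handler never runs.
def run_task(conditional, task_vars, handler):
    if not eval(conditional, {}, task_vars):   # e.g. "network_state != {}"
        return {"skipped": True, "false_condition": conditional}
    return handler(task_vars)                  # only reached when the gate passes

result = run_task(
    "network_state != {}",
    {"network_state": {}},      # role default is empty, so the gate fails
    lambda v: {"changed": False},
)
print(result)   # {'skipped': True, 'false_condition': 'network_state != {}'}
```

This matches the `skipping: [managed_node1] => {"false_condition": "network_state != {}"}` result emitted above.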
30529 1726882696.91756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882696.91841: in run() - task 12673a56-9f93-b0f1-edc0-00000000233d 30529 1726882696.91855: variable 'ansible_search_path' from source: unknown 30529 1726882696.91858: variable 'ansible_search_path' from source: unknown 30529 1726882696.91885: calling self._execute() 30529 1726882696.91961: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.91965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.91973: variable 'omit' from source: magic vars 30529 1726882696.92238: variable 'ansible_distribution_major_version' from source: facts 30529 1726882696.92249: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882696.92256: variable 'omit' from source: magic vars 30529 1726882696.92306: variable 'omit' from source: magic vars 30529 1726882696.92328: variable 'omit' from source: magic vars 30529 1726882696.92360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882696.92386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882696.92406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882696.92420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.92430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882696.92453: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882696.92459: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.92464: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882696.92535: Set connection var ansible_shell_executable to /bin/sh 30529 1726882696.92539: Set connection var ansible_pipelining to False 30529 1726882696.92541: Set connection var ansible_shell_type to sh 30529 1726882696.92550: Set connection var ansible_timeout to 10 30529 1726882696.92552: Set connection var ansible_connection to ssh 30529 1726882696.92562: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882696.92576: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.92578: variable 'ansible_connection' from source: unknown 30529 1726882696.92581: variable 'ansible_module_compression' from source: unknown 30529 1726882696.92583: variable 'ansible_shell_type' from source: unknown 30529 1726882696.92585: variable 'ansible_shell_executable' from source: unknown 30529 1726882696.92590: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882696.92594: variable 'ansible_pipelining' from source: unknown 30529 1726882696.92597: variable 'ansible_timeout' from source: unknown 30529 1726882696.92599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882696.92742: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882696.92751: variable 'omit' from source: magic vars 30529 1726882696.92756: starting attempt loop 30529 1726882696.92759: running the handler 30529 1726882696.92770: _low_level_execute_command(): starting 30529 1726882696.92776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882696.93281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882696.93285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882696.93291: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.93333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882696.93344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.93403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.95002: stdout chunk (state=3): >>>/root <<< 30529 1726882696.95097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.95124: stderr chunk (state=3): >>><<< 30529 1726882696.95127: stdout chunk (state=3): >>><<< 30529 1726882696.95147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.95158: _low_level_execute_command(): starting 30529 1726882696.95164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588 `" && echo ansible-tmp-1726882696.9514713-35641-41005423962588="` echo /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588 `" ) && sleep 0' 30529 1726882696.95602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882696.95605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.95607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 30529 1726882696.95618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.95659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.95662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.95711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882696.97558: stdout chunk (state=3): >>>ansible-tmp-1726882696.9514713-35641-41005423962588=/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588 <<< 30529 1726882696.97668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882696.97695: stderr chunk (state=3): >>><<< 30529 1726882696.97699: stdout chunk (state=3): >>><<< 30529 1726882696.97714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882696.9514713-35641-41005423962588=/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882696.97750: variable 'ansible_module_compression' from source: unknown 30529 1726882696.97780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882696.97809: variable 'ansible_facts' from source: unknown 30529 1726882696.97860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py 30529 1726882696.97954: Sending initial data 30529 1726882696.97958: Sent initial data (152 bytes) 30529 1726882696.98387: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882696.98391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882696.98395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.98397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882696.98400: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882696.98451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882696.98455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882696.98499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.00007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882697.00042: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882697.00081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_trhgp0s /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py <<< 30529 1726882697.00089: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py" <<< 30529 1726882697.00124: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_trhgp0s" to remote "/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py" <<< 30529 1726882697.00637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.00674: stderr chunk (state=3): >>><<< 30529 1726882697.00677: stdout chunk (state=3): >>><<< 30529 1726882697.00717: done transferring module to remote 30529 1726882697.00726: _low_level_execute_command(): starting 30529 1726882697.00729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/ /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py && sleep 0' 30529 1726882697.01151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882697.01154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882697.01156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.01158: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.01160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.01219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882697.01222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.01259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.02956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.02981: stderr chunk (state=3): >>><<< 30529 1726882697.02984: stdout chunk (state=3): >>><<< 30529 1726882697.03000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882697.03003: _low_level_execute_command(): starting 30529 1726882697.03007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/AnsiballZ_ping.py && sleep 0' 30529 1726882697.03440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.03443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.03445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882697.03447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882697.03449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.03497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
<<< 30529 1726882697.03501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.03554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.18282: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882697.19521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882697.19548: stderr chunk (state=3): >>><<< 30529 1726882697.19551: stdout chunk (state=3): >>><<< 30529 1726882697.19565: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882697.19591: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882697.19604: _low_level_execute_command(): starting 30529 1726882697.19608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882696.9514713-35641-41005423962588/ > /dev/null 2>&1 && sleep 0' 30529 1726882697.20054: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.20057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882697.20060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.20062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.20064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.20112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882697.20116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.20162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.21946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.21971: stderr chunk (state=3): >>><<< 30529 1726882697.21980: stdout chunk (state=3): >>><<< 30529 1726882697.21990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
30529 1726882697.21995: handler run complete 30529 1726882697.22008: attempt loop complete, returning result 30529 1726882697.22011: _execute() done 30529 1726882697.22014: dumping result to json 30529 1726882697.22017: done dumping result, returning 30529 1726882697.22025: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-00000000233d] 30529 1726882697.22029: sending task result for task 12673a56-9f93-b0f1-edc0-00000000233d 30529 1726882697.22120: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000233d 30529 1726882697.22122: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882697.22181: no more pending results, returning what we have 30529 1726882697.22185: results queue empty 30529 1726882697.22186: checking for any_errors_fatal 30529 1726882697.22196: done checking for any_errors_fatal 30529 1726882697.22197: checking for max_fail_percentage 30529 1726882697.22199: done checking for max_fail_percentage 30529 1726882697.22200: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.22201: done checking to see if all hosts have failed 30529 1726882697.22201: getting the remaining hosts for this loop 30529 1726882697.22203: done getting the remaining hosts for this loop 30529 1726882697.22206: getting the next task for host managed_node1 30529 1726882697.22217: done getting next task for host managed_node1 30529 1726882697.22219: ^ task is: TASK: meta (role_complete) 30529 1726882697.22224: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882697.22237: getting variables 30529 1726882697.22238: in VariableManager get_vars() 30529 1726882697.22286: Calling all_inventory to load vars for managed_node1 30529 1726882697.22290: Calling groups_inventory to load vars for managed_node1 30529 1726882697.22295: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.22306: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.22309: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.22312: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.23257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.28191: done with get_vars() 30529 1726882697.28213: done getting variables 30529 1726882697.28263: done queuing things up, now waiting for results queue to drain 30529 1726882697.28264: results queue empty 30529 1726882697.28265: checking for any_errors_fatal 30529 1726882697.28267: done checking for any_errors_fatal 30529 1726882697.28267: checking for max_fail_percentage 30529 1726882697.28268: done checking for max_fail_percentage 30529 1726882697.28269: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882697.28269: done checking to see if all hosts have failed 30529 1726882697.28270: getting the remaining hosts for this loop 30529 1726882697.28271: done getting the remaining hosts for this loop 30529 1726882697.28273: getting the next task for host managed_node1 30529 1726882697.28277: done getting next task for host managed_node1 30529 1726882697.28279: ^ task is: TASK: Include network role 30529 1726882697.28282: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882697.28284: getting variables 30529 1726882697.28285: in VariableManager get_vars() 30529 1726882697.28295: Calling all_inventory to load vars for managed_node1 30529 1726882697.28297: Calling groups_inventory to load vars for managed_node1 30529 1726882697.28298: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.28302: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.28303: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.28305: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.28914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.29745: done with get_vars() 30529 1726882697.29759: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.384) 0:01:51.324 ****** 30529 1726882697.29809: entering _queue_task() for managed_node1/include_role 30529 1726882697.30083: worker is 1 (out of 1 available) 30529 1726882697.30099: exiting _queue_task() for managed_node1/include_role 30529 1726882697.30112: done queuing things up, now waiting for results queue to drain 30529 1726882697.30114: waiting for pending results... 
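An aside on the `ping` result that completed just above (`ok: [managed_node1] => {"changed": false, "ping": "pong"}` with stdout `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`): the contract visible in the log is that the module echoes its `data` argument back under the key `ping`. A minimal sketch of that contract — this is a simplified stand-in, not the real `ansible.builtin.ping` source, and the `ping_module` name is illustrative:

```python
import json

def ping_module(module_args: dict) -> dict:
    """Toy model of the round-trip seen in the log: echo `data` back as `ping`."""
    data = module_args.get("data", "pong")
    # The real module deliberately raises when asked to, to test failure paths;
    # modeled here with a plain exception (assumption: simplified behavior).
    if data == "crash":
        raise RuntimeError("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

result = ping_module({"data": "pong"})
print(json.dumps(result))  # matches the stdout chunk captured in the log
```

The controller wraps exactly this kind of JSON-on-stdout exchange in the `_low_level_execute_command()` calls traced above.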
30529 1726882697.30308: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882697.30413: in run() - task 12673a56-9f93-b0f1-edc0-000000002142 30529 1726882697.30427: variable 'ansible_search_path' from source: unknown 30529 1726882697.30431: variable 'ansible_search_path' from source: unknown 30529 1726882697.30459: calling self._execute() 30529 1726882697.30534: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.30539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.30549: variable 'omit' from source: magic vars 30529 1726882697.30833: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.30844: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.30850: _execute() done 30529 1726882697.30853: dumping result to json 30529 1726882697.30855: done dumping result, returning 30529 1726882697.30861: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000002142] 30529 1726882697.30866: sending task result for task 12673a56-9f93-b0f1-edc0-000000002142 30529 1726882697.30973: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002142 30529 1726882697.30976: WORKER PROCESS EXITING 30529 1726882697.31013: no more pending results, returning what we have 30529 1726882697.31018: in VariableManager get_vars() 30529 1726882697.31062: Calling all_inventory to load vars for managed_node1 30529 1726882697.31064: Calling groups_inventory to load vars for managed_node1 30529 1726882697.31067: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.31078: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.31081: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.31084: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.31992: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.32831: done with get_vars() 30529 1726882697.32844: variable 'ansible_search_path' from source: unknown 30529 1726882697.32845: variable 'ansible_search_path' from source: unknown 30529 1726882697.32933: variable 'omit' from source: magic vars 30529 1726882697.32960: variable 'omit' from source: magic vars 30529 1726882697.32968: variable 'omit' from source: magic vars 30529 1726882697.32971: we have included files to process 30529 1726882697.32971: generating all_blocks data 30529 1726882697.32973: done generating all_blocks data 30529 1726882697.32976: processing included file: fedora.linux_system_roles.network 30529 1726882697.32988: in VariableManager get_vars() 30529 1726882697.33000: done with get_vars() 30529 1726882697.33019: in VariableManager get_vars() 30529 1726882697.33029: done with get_vars() 30529 1726882697.33056: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882697.33130: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882697.33177: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882697.33434: in VariableManager get_vars() 30529 1726882697.33448: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882697.34629: iterating over new_blocks loaded from include file 30529 1726882697.34630: in VariableManager get_vars() 30529 1726882697.34642: done with get_vars() 30529 1726882697.34643: filtering new block on tags 30529 1726882697.34796: done filtering new block on tags 30529 1726882697.34799: in VariableManager get_vars() 30529 1726882697.34809: done with get_vars() 30529 1726882697.34810: filtering new block on tags 30529 1726882697.34821: done 
filtering new block on tags 30529 1726882697.34822: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882697.34826: extending task lists for all hosts with included blocks 30529 1726882697.34887: done extending task lists 30529 1726882697.34888: done processing included files 30529 1726882697.34889: results queue empty 30529 1726882697.34889: checking for any_errors_fatal 30529 1726882697.34890: done checking for any_errors_fatal 30529 1726882697.34891: checking for max_fail_percentage 30529 1726882697.34891: done checking for max_fail_percentage 30529 1726882697.34892: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.34894: done checking to see if all hosts have failed 30529 1726882697.34895: getting the remaining hosts for this loop 30529 1726882697.34896: done getting the remaining hosts for this loop 30529 1726882697.34897: getting the next task for host managed_node1 30529 1726882697.34900: done getting next task for host managed_node1 30529 1726882697.34902: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882697.34904: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882697.34911: getting variables 30529 1726882697.34912: in VariableManager get_vars() 30529 1726882697.34920: Calling all_inventory to load vars for managed_node1 30529 1726882697.34922: Calling groups_inventory to load vars for managed_node1 30529 1726882697.34923: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.34926: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.34928: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.34929: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.35601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.36444: done with get_vars() 30529 1726882697.36459: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:17 -0400 (0:00:00.066) 0:01:51.391 ****** 30529 1726882697.36510: entering _queue_task() for managed_node1/include_tasks 30529 1726882697.36764: worker is 1 (out of 1 available) 30529 1726882697.36776: exiting _queue_task() for managed_node1/include_tasks 30529 1726882697.36788: done queuing things up, now waiting for results queue to drain 30529 1726882697.36789: waiting for pending results... 
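Every task in this run logs the same gate before executing: `Evaluated conditional (ansible_distribution_major_version != '6'): True`. A hedged sketch of that per-task decision — a toy stand-in for Ansible's conditional evaluation, not the real `TaskExecutor`, modeling only the single expression form seen in this log:

```python
def should_run(task_when: str, facts: dict) -> bool:
    """Return True if the task's `when:` expression passes against gathered facts."""
    # Only the one conditional observed in this log is modeled (assumption:
    # anything else is out of scope for this sketch).
    if task_when == "ansible_distribution_major_version != '6'":
        return facts.get("ansible_distribution_major_version") != "6"
    raise ValueError("unmodeled conditional: " + task_when)

facts = {"ansible_distribution_major_version": "9"}
print(should_run("ansible_distribution_major_version != '6'", facts))  # True: task executes
```

When the expression evaluates False, the log instead records `when evaluation is False, skipping this task`, as happens further down for the ostree check.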
30529 1726882697.36973: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882697.37067: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a4 30529 1726882697.37080: variable 'ansible_search_path' from source: unknown 30529 1726882697.37083: variable 'ansible_search_path' from source: unknown 30529 1726882697.37118: calling self._execute() 30529 1726882697.37192: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.37199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.37204: variable 'omit' from source: magic vars 30529 1726882697.37480: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.37494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.37498: _execute() done 30529 1726882697.37501: dumping result to json 30529 1726882697.37504: done dumping result, returning 30529 1726882697.37511: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-0000000024a4] 30529 1726882697.37516: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a4 30529 1726882697.37603: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a4 30529 1726882697.37607: WORKER PROCESS EXITING 30529 1726882697.37659: no more pending results, returning what we have 30529 1726882697.37664: in VariableManager get_vars() 30529 1726882697.37718: Calling all_inventory to load vars for managed_node1 30529 1726882697.37721: Calling groups_inventory to load vars for managed_node1 30529 1726882697.37723: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.37733: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.37735: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.37738: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882697.38520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.39383: done with get_vars() 30529 1726882697.39401: variable 'ansible_search_path' from source: unknown 30529 1726882697.39402: variable 'ansible_search_path' from source: unknown 30529 1726882697.39427: we have included files to process 30529 1726882697.39428: generating all_blocks data 30529 1726882697.39430: done generating all_blocks data 30529 1726882697.39431: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882697.39432: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882697.39433: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882697.39803: done processing included file 30529 1726882697.39805: iterating over new_blocks loaded from include file 30529 1726882697.39806: in VariableManager get_vars() 30529 1726882697.39822: done with get_vars() 30529 1726882697.39823: filtering new block on tags 30529 1726882697.39842: done filtering new block on tags 30529 1726882697.39843: in VariableManager get_vars() 30529 1726882697.39859: done with get_vars() 30529 1726882697.39860: filtering new block on tags 30529 1726882697.39885: done filtering new block on tags 30529 1726882697.39886: in VariableManager get_vars() 30529 1726882697.39906: done with get_vars() 30529 1726882697.39907: filtering new block on tags 30529 1726882697.39929: done filtering new block on tags 30529 1726882697.39931: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882697.39934: extending task lists for 
all hosts with included blocks 30529 1726882697.40873: done extending task lists 30529 1726882697.40874: done processing included files 30529 1726882697.40875: results queue empty 30529 1726882697.40875: checking for any_errors_fatal 30529 1726882697.40877: done checking for any_errors_fatal 30529 1726882697.40878: checking for max_fail_percentage 30529 1726882697.40878: done checking for max_fail_percentage 30529 1726882697.40879: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.40879: done checking to see if all hosts have failed 30529 1726882697.40880: getting the remaining hosts for this loop 30529 1726882697.40881: done getting the remaining hosts for this loop 30529 1726882697.40882: getting the next task for host managed_node1 30529 1726882697.40886: done getting next task for host managed_node1 30529 1726882697.40889: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882697.40892: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882697.40901: getting variables 30529 1726882697.40902: in VariableManager get_vars() 30529 1726882697.40911: Calling all_inventory to load vars for managed_node1 30529 1726882697.40913: Calling groups_inventory to load vars for managed_node1 30529 1726882697.40914: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.40917: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.40918: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.40920: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.41580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.42425: done with get_vars() 30529 1726882697.42439: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.059) 0:01:51.451 ****** 30529 1726882697.42492: entering _queue_task() for managed_node1/setup 30529 1726882697.42746: worker is 1 (out of 1 available) 30529 1726882697.42758: exiting _queue_task() for managed_node1/setup 30529 1726882697.42771: done queuing things up, now waiting for results queue to drain 30529 1726882697.42773: waiting for pending results... 
30529 1726882697.42957: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882697.43059: in run() - task 12673a56-9f93-b0f1-edc0-0000000024fb 30529 1726882697.43072: variable 'ansible_search_path' from source: unknown 30529 1726882697.43076: variable 'ansible_search_path' from source: unknown 30529 1726882697.43107: calling self._execute() 30529 1726882697.43176: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.43180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.43191: variable 'omit' from source: magic vars 30529 1726882697.43459: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.43469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.43612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882697.45071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882697.45119: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882697.45144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882697.45169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882697.45192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882697.45247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882697.45268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882697.45285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882697.45316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882697.45327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882697.45363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882697.45378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882697.45397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882697.45425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882697.45436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882697.45542: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882697.45549: variable 'ansible_facts' from source: unknown 30529 1726882697.46010: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882697.46013: when evaluation is False, skipping this task 30529 1726882697.46017: _execute() done 30529 1726882697.46019: dumping result to json 30529 1726882697.46021: done dumping result, returning 30529 1726882697.46028: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-0000000024fb] 30529 1726882697.46031: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fb 30529 1726882697.46116: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fb 30529 1726882697.46119: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882697.46159: no more pending results, returning what we have 30529 1726882697.46163: results queue empty 30529 1726882697.46164: checking for any_errors_fatal 30529 1726882697.46165: done checking for any_errors_fatal 30529 1726882697.46166: checking for max_fail_percentage 30529 1726882697.46167: done checking for max_fail_percentage 30529 1726882697.46168: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.46169: done checking to see if all hosts have failed 30529 1726882697.46169: getting the remaining hosts for this loop 30529 1726882697.46171: done getting the remaining hosts for this loop 30529 1726882697.46175: getting the next task for host managed_node1 30529 1726882697.46185: done getting next task for host managed_node1 30529 1726882697.46191: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882697.46198: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882697.46225: getting variables 30529 1726882697.46227: in VariableManager get_vars() 30529 1726882697.46271: Calling all_inventory to load vars for managed_node1 30529 1726882697.46273: Calling groups_inventory to load vars for managed_node1 30529 1726882697.46275: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.46287: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.46291: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.46305: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.47117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.48102: done with get_vars() 30529 1726882697.48118: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:17 -0400 (0:00:00.056) 0:01:51.507 ****** 30529 1726882697.48186: entering _queue_task() for managed_node1/stat 30529 1726882697.48427: worker is 1 (out of 1 available) 30529 1726882697.48441: exiting _queue_task() for managed_node1/stat 30529 1726882697.48452: done queuing things up, now waiting for results queue to drain 30529 1726882697.48454: waiting for pending results... 
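The skip recorded just above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. A pure-Python equivalent of that Jinja expression — the fact names below are illustrative placeholders, not the role's actual `__network_required_facts` list:

```python
# Facts the role would require vs. facts already gathered (assumed example data).
required = ["distribution", "distribution_major_version"]
ansible_facts = {
    "distribution": "RedHat",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Mirrors the `difference` filter: items of `required` not present in the facts.
missing = [f for f in required if f not in ansible_facts]
needs_gather = len(missing) > 0
print(needs_gather)  # False -> the setup task is skipped, matching the log
```

Because every required fact is already cached, the `setup` task is skipped rather than re-gathering, which is why the log shows the result censored by `no_log: true` with `"changed": false`.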
30529 1726882697.48632: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882697.48718: in run() - task 12673a56-9f93-b0f1-edc0-0000000024fd 30529 1726882697.48731: variable 'ansible_search_path' from source: unknown 30529 1726882697.48736: variable 'ansible_search_path' from source: unknown 30529 1726882697.48761: calling self._execute() 30529 1726882697.48839: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.48843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.48850: variable 'omit' from source: magic vars 30529 1726882697.49114: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.49124: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.49239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882697.49428: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882697.49462: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882697.49486: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882697.49513: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882697.49576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882697.49596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882697.49615: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882697.49632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882697.49702: variable '__network_is_ostree' from source: set_fact 30529 1726882697.49708: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882697.49711: when evaluation is False, skipping this task 30529 1726882697.49713: _execute() done 30529 1726882697.49716: dumping result to json 30529 1726882697.49720: done dumping result, returning 30529 1726882697.49727: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-0000000024fd] 30529 1726882697.49731: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fd 30529 1726882697.49816: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fd 30529 1726882697.49819: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882697.49865: no more pending results, returning what we have 30529 1726882697.49868: results queue empty 30529 1726882697.49869: checking for any_errors_fatal 30529 1726882697.49877: done checking for any_errors_fatal 30529 1726882697.49878: checking for max_fail_percentage 30529 1726882697.49880: done checking for max_fail_percentage 30529 1726882697.49880: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.49882: done checking to see if all hosts have failed 30529 1726882697.49882: getting the remaining hosts for this loop 30529 1726882697.49884: done getting the remaining hosts for this loop 30529 
1726882697.49890: getting the next task for host managed_node1 30529 1726882697.49900: done getting next task for host managed_node1 30529 1726882697.49903: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882697.49908: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882697.49929: getting variables 30529 1726882697.49931: in VariableManager get_vars() 30529 1726882697.49969: Calling all_inventory to load vars for managed_node1 30529 1726882697.49971: Calling groups_inventory to load vars for managed_node1 30529 1726882697.49974: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.49982: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.49985: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.49990: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.50783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.51656: done with get_vars() 30529 1726882697.51672: done getting variables 30529 1726882697.51715: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:17 -0400 (0:00:00.035) 0:01:51.543 ****** 30529 1726882697.51742: entering _queue_task() for managed_node1/set_fact 30529 1726882697.51971: worker is 1 (out of 1 available) 30529 1726882697.51984: exiting _queue_task() for managed_node1/set_fact 30529 1726882697.52002: done queuing things up, now waiting for results queue to drain 30529 1726882697.52003: waiting for pending results... 
30529 1726882697.52179: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882697.52264: in run() - task 12673a56-9f93-b0f1-edc0-0000000024fe 30529 1726882697.52277: variable 'ansible_search_path' from source: unknown 30529 1726882697.52283: variable 'ansible_search_path' from source: unknown 30529 1726882697.52314: calling self._execute() 30529 1726882697.52386: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.52394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.52402: variable 'omit' from source: magic vars 30529 1726882697.52673: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.52683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.52802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882697.52989: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882697.53022: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882697.53046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882697.53071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882697.53136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882697.53153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882697.53171: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882697.53191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882697.53256: variable '__network_is_ostree' from source: set_fact 30529 1726882697.53262: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882697.53265: when evaluation is False, skipping this task 30529 1726882697.53267: _execute() done 30529 1726882697.53270: dumping result to json 30529 1726882697.53272: done dumping result, returning 30529 1726882697.53280: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-0000000024fe] 30529 1726882697.53284: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fe 30529 1726882697.53364: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024fe 30529 1726882697.53367: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882697.53418: no more pending results, returning what we have 30529 1726882697.53423: results queue empty 30529 1726882697.53423: checking for any_errors_fatal 30529 1726882697.53431: done checking for any_errors_fatal 30529 1726882697.53431: checking for max_fail_percentage 30529 1726882697.53433: done checking for max_fail_percentage 30529 1726882697.53434: checking to see if all hosts have failed and the running result is not ok 30529 1726882697.53435: done checking to see if all hosts have failed 30529 1726882697.53436: getting the remaining hosts for this loop 30529 1726882697.53438: done getting the remaining hosts for this loop 
30529 1726882697.53441: getting the next task for host managed_node1 30529 1726882697.53453: done getting next task for host managed_node1 30529 1726882697.53457: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882697.53462: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882697.53486: getting variables 30529 1726882697.53490: in VariableManager get_vars() 30529 1726882697.53531: Calling all_inventory to load vars for managed_node1 30529 1726882697.53533: Calling groups_inventory to load vars for managed_node1 30529 1726882697.53536: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882697.53545: Calling all_plugins_play to load vars for managed_node1 30529 1726882697.53547: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882697.53550: Calling groups_plugins_play to load vars for managed_node1 30529 1726882697.54496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882697.55348: done with get_vars() 30529 1726882697.55365: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:17 -0400 (0:00:00.036) 0:01:51.580 ****** 30529 1726882697.55439: entering _queue_task() for managed_node1/service_facts 30529 1726882697.55697: worker is 1 (out of 1 available) 30529 1726882697.55711: exiting _queue_task() for managed_node1/service_facts 30529 1726882697.55725: done queuing things up, now waiting for results queue to drain 30529 1726882697.55726: waiting for pending results... 
30529 1726882697.55934: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882697.56025: in run() - task 12673a56-9f93-b0f1-edc0-000000002500 30529 1726882697.56039: variable 'ansible_search_path' from source: unknown 30529 1726882697.56042: variable 'ansible_search_path' from source: unknown 30529 1726882697.56071: calling self._execute() 30529 1726882697.56145: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.56149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.56157: variable 'omit' from source: magic vars 30529 1726882697.56425: variable 'ansible_distribution_major_version' from source: facts 30529 1726882697.56435: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882697.56441: variable 'omit' from source: magic vars 30529 1726882697.56495: variable 'omit' from source: magic vars 30529 1726882697.56521: variable 'omit' from source: magic vars 30529 1726882697.56551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882697.56583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882697.56606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882697.56619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882697.56630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882697.56653: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882697.56656: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.56659: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882697.56735: Set connection var ansible_shell_executable to /bin/sh 30529 1726882697.56739: Set connection var ansible_pipelining to False 30529 1726882697.56742: Set connection var ansible_shell_type to sh 30529 1726882697.56750: Set connection var ansible_timeout to 10 30529 1726882697.56752: Set connection var ansible_connection to ssh 30529 1726882697.56757: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882697.56773: variable 'ansible_shell_executable' from source: unknown 30529 1726882697.56776: variable 'ansible_connection' from source: unknown 30529 1726882697.56779: variable 'ansible_module_compression' from source: unknown 30529 1726882697.56782: variable 'ansible_shell_type' from source: unknown 30529 1726882697.56784: variable 'ansible_shell_executable' from source: unknown 30529 1726882697.56786: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882697.56790: variable 'ansible_pipelining' from source: unknown 30529 1726882697.56798: variable 'ansible_timeout' from source: unknown 30529 1726882697.56800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882697.56941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882697.56951: variable 'omit' from source: magic vars 30529 1726882697.56957: starting attempt loop 30529 1726882697.56960: running the handler 30529 1726882697.56972: _low_level_execute_command(): starting 30529 1726882697.56979: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882697.57496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882697.57500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.57503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882697.57506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.57551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882697.57554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882697.57556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.57612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.59282: stdout chunk (state=3): >>>/root <<< 30529 1726882697.59382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.59414: stderr chunk (state=3): >>><<< 30529 1726882697.59417: stdout chunk (state=3): >>><<< 30529 1726882697.59438: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882697.59449: _low_level_execute_command(): starting 30529 1726882697.59454: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526 `" && echo ansible-tmp-1726882697.5943775-35654-112814946429526="` echo /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526 `" ) && sleep 0' 30529 1726882697.59882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.59886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.59898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.59900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.59947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882697.59950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882697.59954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.60003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.61835: stdout chunk (state=3): >>>ansible-tmp-1726882697.5943775-35654-112814946429526=/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526 <<< 30529 1726882697.61947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.61971: stderr chunk (state=3): >>><<< 30529 1726882697.61974: stdout chunk (state=3): >>><<< 30529 1726882697.61985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882697.5943775-35654-112814946429526=/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882697.62027: variable 'ansible_module_compression' from source: unknown 30529 1726882697.62061: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882697.62097: variable 'ansible_facts' from source: unknown 30529 1726882697.62154: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py 30529 1726882697.62259: Sending initial data 30529 1726882697.62263: Sent initial data (162 bytes) 30529 1726882697.62660: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.62664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.62677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.62731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882697.62735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.62779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.64296: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882697.64303: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882697.64334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882697.64376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmphg3636iy /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py <<< 30529 1726882697.64378: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py" <<< 30529 1726882697.64420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmphg3636iy" to remote "/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py" <<< 30529 1726882697.64947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.64983: stderr chunk (state=3): >>><<< 30529 1726882697.64986: stdout chunk (state=3): >>><<< 30529 1726882697.65003: done transferring module to remote 30529 1726882697.65012: _low_level_execute_command(): starting 30529 1726882697.65015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/ /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py && sleep 0' 30529 1726882697.65434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.65437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882697.65440: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.65442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.65447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.65507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882697.65515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.65545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882697.67246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882697.67268: stderr chunk (state=3): >>><<< 30529 1726882697.67271: stdout chunk (state=3): >>><<< 30529 1726882697.67285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882697.67290: _low_level_execute_command(): starting 30529 1726882697.67294: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/AnsiballZ_service_facts.py && sleep 0' 30529 1726882697.67689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882697.67692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.67697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882697.67699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882697.67701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882697.67749: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882697.67754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882697.67802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.17878: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882699.17909: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30529 1726882699.17930: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882699.19421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882699.19452: stderr chunk (state=3): >>><<< 30529 1726882699.19455: stdout chunk (state=3): >>><<< 30529 1726882699.19485: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882699.19937: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882699.19945: _low_level_execute_command(): starting 30529 1726882699.19950: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882697.5943775-35654-112814946429526/ > /dev/null 2>&1 && sleep 0' 30529 1726882699.20406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882699.20409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882699.20412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.20414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882699.20417: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.20464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.20467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882699.20473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.20514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.22272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.22297: stderr chunk (state=3): >>><<< 30529 1726882699.22301: stdout chunk (state=3): >>><<< 30529 1726882699.22312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882699.22317: handler run complete 30529 1726882699.22429: variable 'ansible_facts' from source: unknown 30529 1726882699.22524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.22800: variable 'ansible_facts' from source: unknown 30529 1726882699.22878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.22995: attempt loop complete, returning result 30529 1726882699.23003: _execute() done 30529 1726882699.23005: dumping result to json 30529 1726882699.23037: done dumping result, returning 30529 1726882699.23045: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-000000002500] 30529 1726882699.23050: sending task result for task 12673a56-9f93-b0f1-edc0-000000002500 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882699.23683: no more pending results, returning what we have 30529 1726882699.23686: results queue empty 30529 1726882699.23689: checking for any_errors_fatal 30529 1726882699.23696: done checking for any_errors_fatal 30529 1726882699.23696: checking for max_fail_percentage 30529 1726882699.23698: done checking for max_fail_percentage 30529 1726882699.23699: checking to see if all hosts have failed and the running result is not ok 30529 1726882699.23699: done checking to see if all hosts have failed 30529 1726882699.23700: getting the remaining hosts for this loop 30529 1726882699.23701: done getting the remaining hosts for this loop 30529 1726882699.23705: getting the next task for host managed_node1 30529 1726882699.23711: done getting next task 
for host managed_node1 30529 1726882699.23714: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882699.23718: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882699.23730: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002500 30529 1726882699.23733: WORKER PROCESS EXITING 30529 1726882699.23741: getting variables 30529 1726882699.23742: in VariableManager get_vars() 30529 1726882699.23769: Calling all_inventory to load vars for managed_node1 30529 1726882699.23770: Calling groups_inventory to load vars for managed_node1 30529 1726882699.23772: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882699.23778: Calling all_plugins_play to load vars for managed_node1 30529 1726882699.23780: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882699.23785: Calling groups_plugins_play to load vars for managed_node1 30529 1726882699.24608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.25480: done with get_vars() 30529 1726882699.25500: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:19 -0400 (0:00:01.701) 0:01:53.281 ****** 30529 1726882699.25567: entering _queue_task() for managed_node1/package_facts 30529 1726882699.25805: worker is 1 (out of 1 available) 30529 1726882699.25818: exiting _queue_task() for managed_node1/package_facts 30529 1726882699.25830: done queuing things up, now waiting for results queue to drain 30529 1726882699.25833: waiting for pending results... 
30529 1726882699.26020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882699.26121: in run() - task 12673a56-9f93-b0f1-edc0-000000002501 30529 1726882699.26139: variable 'ansible_search_path' from source: unknown 30529 1726882699.26143: variable 'ansible_search_path' from source: unknown 30529 1726882699.26165: calling self._execute() 30529 1726882699.26247: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.26250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.26259: variable 'omit' from source: magic vars 30529 1726882699.26547: variable 'ansible_distribution_major_version' from source: facts 30529 1726882699.26557: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882699.26562: variable 'omit' from source: magic vars 30529 1726882699.26621: variable 'omit' from source: magic vars 30529 1726882699.26643: variable 'omit' from source: magic vars 30529 1726882699.26674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882699.26708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882699.26726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882699.26738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882699.26748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882699.26771: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882699.26775: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.26777: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882699.26852: Set connection var ansible_shell_executable to /bin/sh 30529 1726882699.26856: Set connection var ansible_pipelining to False 30529 1726882699.26858: Set connection var ansible_shell_type to sh 30529 1726882699.26866: Set connection var ansible_timeout to 10 30529 1726882699.26869: Set connection var ansible_connection to ssh 30529 1726882699.26874: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882699.26892: variable 'ansible_shell_executable' from source: unknown 30529 1726882699.26897: variable 'ansible_connection' from source: unknown 30529 1726882699.26899: variable 'ansible_module_compression' from source: unknown 30529 1726882699.26902: variable 'ansible_shell_type' from source: unknown 30529 1726882699.26904: variable 'ansible_shell_executable' from source: unknown 30529 1726882699.26907: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.26912: variable 'ansible_pipelining' from source: unknown 30529 1726882699.26915: variable 'ansible_timeout' from source: unknown 30529 1726882699.26918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.27056: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882699.27065: variable 'omit' from source: magic vars 30529 1726882699.27071: starting attempt loop 30529 1726882699.27073: running the handler 30529 1726882699.27085: _low_level_execute_command(): starting 30529 1726882699.27096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882699.27601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882699.27605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882699.27609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.27659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.27662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882699.27669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.27715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.29283: stdout chunk (state=3): >>>/root <<< 30529 1726882699.29380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.29414: stderr chunk (state=3): >>><<< 30529 1726882699.29418: stdout chunk (state=3): >>><<< 30529 1726882699.29440: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882699.29450: _low_level_execute_command(): starting 30529 1726882699.29455: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237 `" && echo ansible-tmp-1726882699.294381-35678-227766106140237="` echo /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237 `" ) && sleep 0' 30529 1726882699.29901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882699.29904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.29907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 
1726882699.29916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882699.29919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.29964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.29971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882699.29973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.30015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.31857: stdout chunk (state=3): >>>ansible-tmp-1726882699.294381-35678-227766106140237=/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237 <<< 30529 1726882699.31960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.31985: stderr chunk (state=3): >>><<< 30529 1726882699.31988: stdout chunk (state=3): >>><<< 30529 1726882699.32010: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882699.294381-35678-227766106140237=/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882699.32046: variable 'ansible_module_compression' from source: unknown 30529 1726882699.32083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882699.32139: variable 'ansible_facts' from source: unknown 30529 1726882699.32255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py 30529 1726882699.32359: Sending initial data 30529 1726882699.32362: Sent initial data (161 bytes) 30529 1726882699.32802: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882699.32805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.32808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882699.32809: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882699.32811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.32857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.32861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.32913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.34418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882699.34422: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882699.34452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882699.34498: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_33j4jt2 /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py <<< 30529 1726882699.34504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py" <<< 30529 1726882699.34539: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp_33j4jt2" to remote "/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py" <<< 30529 1726882699.35536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.35572: stderr chunk (state=3): >>><<< 30529 1726882699.35575: stdout chunk (state=3): >>><<< 30529 1726882699.35618: done transferring module to remote 30529 1726882699.35626: _low_level_execute_command(): starting 30529 1726882699.35630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/ /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py && sleep 0' 30529 1726882699.36051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882699.36054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882699.36056: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.36059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882699.36065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.36112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.36118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.36158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.37864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.37896: stderr chunk (state=3): >>><<< 30529 1726882699.37899: stdout chunk (state=3): >>><<< 30529 1726882699.37906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882699.37909: _low_level_execute_command(): starting 30529 1726882699.37913: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/AnsiballZ_package_facts.py && sleep 0' 30529 1726882699.38326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882699.38329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882699.38332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.38334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882699.38336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.38381: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.38385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.38437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.81968: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30529 1726882699.81994: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": 
[{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30529 1726882699.82009: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": 
"23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30529 1726882699.82144: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30529 1726882699.82153: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30529 1726882699.82156: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": 
"perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30529 1726882699.82162: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882699.83863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882699.83887: stderr chunk (state=3): >>><<< 30529 1726882699.83890: stdout chunk (state=3): >>><<< 30529 1726882699.83936: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882699.85239: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882699.85255: _low_level_execute_command(): starting 30529 1726882699.85264: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882699.294381-35678-227766106140237/ > /dev/null 2>&1 && sleep 0' 30529 1726882699.85700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882699.85705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.85722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882699.85774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882699.85777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882699.85779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882699.85830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882699.87631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882699.87660: stderr chunk (state=3): >>><<< 30529 1726882699.87664: stdout chunk (state=3): >>><<< 30529 1726882699.87676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882699.87681: handler run 
complete 30529 1726882699.88158: variable 'ansible_facts' from source: unknown 30529 1726882699.88435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.89484: variable 'ansible_facts' from source: unknown 30529 1726882699.89723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.90095: attempt loop complete, returning result 30529 1726882699.90103: _execute() done 30529 1726882699.90105: dumping result to json 30529 1726882699.90218: done dumping result, returning 30529 1726882699.90225: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-000000002501] 30529 1726882699.90228: sending task result for task 12673a56-9f93-b0f1-edc0-000000002501 30529 1726882699.91567: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002501 30529 1726882699.91570: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882699.91673: no more pending results, returning what we have 30529 1726882699.91678: results queue empty 30529 1726882699.91679: checking for any_errors_fatal 30529 1726882699.91684: done checking for any_errors_fatal 30529 1726882699.91684: checking for max_fail_percentage 30529 1726882699.91685: done checking for max_fail_percentage 30529 1726882699.91686: checking to see if all hosts have failed and the running result is not ok 30529 1726882699.91686: done checking to see if all hosts have failed 30529 1726882699.91687: getting the remaining hosts for this loop 30529 1726882699.91688: done getting the remaining hosts for this loop 30529 1726882699.91691: getting the next task for host managed_node1 30529 1726882699.91698: done getting next task for host managed_node1 30529 
1726882699.91700: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882699.91704: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882699.91713: getting variables 30529 1726882699.91714: in VariableManager get_vars() 30529 1726882699.91741: Calling all_inventory to load vars for managed_node1 30529 1726882699.91743: Calling groups_inventory to load vars for managed_node1 30529 1726882699.91744: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882699.91751: Calling all_plugins_play to load vars for managed_node1 30529 1726882699.91753: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882699.91754: Calling groups_plugins_play to load vars for managed_node1 30529 1726882699.92429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.93354: done with get_vars() 30529 1726882699.93372: done getting variables 30529 1726882699.93418: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:19 -0400 (0:00:00.678) 0:01:53.960 ****** 30529 1726882699.93447: entering _queue_task() for managed_node1/debug 30529 1726882699.93685: worker is 1 (out of 1 available) 30529 1726882699.93699: exiting _queue_task() for managed_node1/debug 30529 1726882699.93713: done queuing things up, now waiting for results queue to drain 30529 1726882699.93715: waiting for pending results... 
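The task header above carries a timing line of the form `Friday 20 September 2024 21:38:19 -0400 (0:00:00.678) 0:01:53.960 ******`, where the parenthesised value is the previous task's own duration and the trailing value is the cumulative elapsed time for the run. As a hedged illustration (this parser is my own sketch, not part of Ansible; it only assumes the `H:MM:SS.mmm` layout visible in the log above), these timings can be extracted like so:

```python
import re

# Matches the "(task_duration) cumulative_elapsed" pair in a timing line such as:
#   Friday 20 September 2024 21:38:19 -0400 (0:00:00.678) 0:01:53.960 ******
TIMING_RE = re.compile(
    r"\((?P<task>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)"
)

def to_seconds(stamp: str) -> float:
    """Convert an H:MM:SS.mmm stamp into seconds."""
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

def parse_timing(line: str):
    """Return (task_duration_s, cumulative_s), or None if the line has no timing pair."""
    match = TIMING_RE.search(line)
    if match is None:
        return None
    return to_seconds(match.group("task")), to_seconds(match.group("total"))
```

For the line above, this yields a task duration of 0.678 s against a cumulative 1 min 53.960 s, which is how the per-task cost of each step in this log can be tallied.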
30529 1726882699.93903: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882699.94003: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a5 30529 1726882699.94018: variable 'ansible_search_path' from source: unknown 30529 1726882699.94021: variable 'ansible_search_path' from source: unknown 30529 1726882699.94052: calling self._execute() 30529 1726882699.94128: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.94132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.94140: variable 'omit' from source: magic vars 30529 1726882699.94425: variable 'ansible_distribution_major_version' from source: facts 30529 1726882699.94435: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882699.94440: variable 'omit' from source: magic vars 30529 1726882699.94477: variable 'omit' from source: magic vars 30529 1726882699.94549: variable 'network_provider' from source: set_fact 30529 1726882699.94562: variable 'omit' from source: magic vars 30529 1726882699.94594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882699.94626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882699.94643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882699.94658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882699.94667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882699.94694: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882699.94698: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882699.94700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.94769: Set connection var ansible_shell_executable to /bin/sh 30529 1726882699.94772: Set connection var ansible_pipelining to False 30529 1726882699.94775: Set connection var ansible_shell_type to sh 30529 1726882699.94783: Set connection var ansible_timeout to 10 30529 1726882699.94785: Set connection var ansible_connection to ssh 30529 1726882699.94792: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882699.94808: variable 'ansible_shell_executable' from source: unknown 30529 1726882699.94813: variable 'ansible_connection' from source: unknown 30529 1726882699.94817: variable 'ansible_module_compression' from source: unknown 30529 1726882699.94819: variable 'ansible_shell_type' from source: unknown 30529 1726882699.94822: variable 'ansible_shell_executable' from source: unknown 30529 1726882699.94824: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.94826: variable 'ansible_pipelining' from source: unknown 30529 1726882699.94828: variable 'ansible_timeout' from source: unknown 30529 1726882699.94830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.94928: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882699.94938: variable 'omit' from source: magic vars 30529 1726882699.94943: starting attempt loop 30529 1726882699.94946: running the handler 30529 1726882699.94981: handler run complete 30529 1726882699.94991: attempt loop complete, returning result 30529 1726882699.94999: _execute() done 30529 1726882699.95002: dumping result to json 30529 1726882699.95005: done dumping result, returning 
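The block above records the resolved per-connection settings as individual `Set connection var X to Y` entries (shell `/bin/sh`, pipelining `False`, timeout `10`, and so on). A minimal sketch for collecting those entries out of a chunk of this log into a dict follows; it is my own helper, not an Ansible API, and it assumes single-token values, which holds for every `Set connection var` line shown here. Values are kept as strings (`"False"`, `"10"`) to stay faithful to the raw log text rather than guessing at types:

```python
import re

# Captures "Set connection var <name> to <value>" entries from verbose
# ansible-playbook output; names and values are single whitespace-free tokens.
SET_VAR_RE = re.compile(r"Set connection var (\S+) to (\S+)")

def collect_connection_vars(log_text: str) -> dict:
    """Return a {var_name: raw_value_string} dict for all entries found."""
    return dict(SET_VAR_RE.findall(log_text))
```

Later `Set connection var` entries for the same name would overwrite earlier ones in the dict, matching the last-write-wins behavior of the log itself.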
30529 1726882699.95011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-0000000024a5] 30529 1726882699.95015: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a5 30529 1726882699.95095: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a5 30529 1726882699.95098: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882699.95163: no more pending results, returning what we have 30529 1726882699.95166: results queue empty 30529 1726882699.95167: checking for any_errors_fatal 30529 1726882699.95173: done checking for any_errors_fatal 30529 1726882699.95174: checking for max_fail_percentage 30529 1726882699.95175: done checking for max_fail_percentage 30529 1726882699.95176: checking to see if all hosts have failed and the running result is not ok 30529 1726882699.95177: done checking to see if all hosts have failed 30529 1726882699.95178: getting the remaining hosts for this loop 30529 1726882699.95179: done getting the remaining hosts for this loop 30529 1726882699.95182: getting the next task for host managed_node1 30529 1726882699.95192: done getting next task for host managed_node1 30529 1726882699.95197: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882699.95202: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882699.95213: getting variables 30529 1726882699.95216: in VariableManager get_vars() 30529 1726882699.95251: Calling all_inventory to load vars for managed_node1 30529 1726882699.95254: Calling groups_inventory to load vars for managed_node1 30529 1726882699.95256: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882699.95264: Calling all_plugins_play to load vars for managed_node1 30529 1726882699.95267: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882699.95269: Calling groups_plugins_play to load vars for managed_node1 30529 1726882699.96028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.96885: done with get_vars() 30529 1726882699.96904: done getting variables 30529 1726882699.96945: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:19 -0400 (0:00:00.035) 0:01:53.995 ****** 30529 1726882699.96972: entering _queue_task() for managed_node1/fail 30529 1726882699.97190: worker is 1 (out of 1 available) 30529 1726882699.97205: exiting _queue_task() for managed_node1/fail 30529 1726882699.97220: done queuing things up, now waiting for results queue to drain 30529 1726882699.97222: waiting for pending results... 30529 1726882699.97391: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882699.97484: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a6 30529 1726882699.97498: variable 'ansible_search_path' from source: unknown 30529 1726882699.97502: variable 'ansible_search_path' from source: unknown 30529 1726882699.97528: calling self._execute() 30529 1726882699.97599: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882699.97603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882699.97611: variable 'omit' from source: magic vars 30529 1726882699.97875: variable 'ansible_distribution_major_version' from source: facts 30529 1726882699.97885: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882699.97968: variable 'network_state' from source: role '' defaults 30529 1726882699.97977: Evaluated conditional (network_state != {}): False 30529 1726882699.97981: when evaluation is False, skipping this task 30529 1726882699.97983: _execute() done 30529 1726882699.97990: dumping result to json 30529 1726882699.97996: done dumping result, returning 30529 1726882699.98000: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-0000000024a6] 30529 1726882699.98003: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a6 30529 1726882699.98084: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a6 30529 1726882699.98090: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882699.98154: no more pending results, returning what we have 30529 1726882699.98157: results queue empty 30529 1726882699.98158: checking for any_errors_fatal 30529 1726882699.98164: done checking for any_errors_fatal 30529 1726882699.98165: checking for max_fail_percentage 30529 1726882699.98166: done checking for max_fail_percentage 30529 1726882699.98167: checking to see if all hosts have failed and the running result is not ok 30529 1726882699.98167: done checking to see if all hosts have failed 30529 1726882699.98168: getting the remaining hosts for this loop 30529 1726882699.98170: done getting the remaining hosts for this loop 30529 1726882699.98173: getting the next task for host managed_node1 30529 1726882699.98180: done getting next task for host managed_node1 30529 1726882699.98183: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882699.98189: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882699.98212: getting variables 30529 1726882699.98213: in VariableManager get_vars() 30529 1726882699.98247: Calling all_inventory to load vars for managed_node1 30529 1726882699.98249: Calling groups_inventory to load vars for managed_node1 30529 1726882699.98251: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882699.98259: Calling all_plugins_play to load vars for managed_node1 30529 1726882699.98261: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882699.98264: Calling groups_plugins_play to load vars for managed_node1 30529 1726882699.99132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882699.99966: done with get_vars() 30529 1726882699.99981: done getting variables 30529 1726882700.00025: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:19 -0400 (0:00:00.030) 0:01:54.026 ****** 30529 1726882700.00048: entering _queue_task() for managed_node1/fail 30529 1726882700.00251: worker is 1 (out of 1 available) 30529 1726882700.00266: exiting _queue_task() for managed_node1/fail 30529 1726882700.00279: done queuing things up, now waiting for results queue to drain 30529 1726882700.00281: waiting for pending results... 30529 1726882700.00453: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882700.00549: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a7 30529 1726882700.00561: variable 'ansible_search_path' from source: unknown 30529 1726882700.00565: variable 'ansible_search_path' from source: unknown 30529 1726882700.00595: calling self._execute() 30529 1726882700.00664: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.00668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.00675: variable 'omit' from source: magic vars 30529 1726882700.00938: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.00947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.01025: variable 'network_state' from source: role '' defaults 30529 1726882700.01036: Evaluated conditional (network_state != {}): False 30529 1726882700.01039: when evaluation is False, skipping this task 30529 1726882700.01041: _execute() done 30529 1726882700.01044: dumping result to json 30529 1726882700.01046: done dumping result, returning 30529 1726882700.01057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-0000000024a7] 30529 1726882700.01059: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a7 30529 1726882700.01144: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a7 30529 1726882700.01147: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882700.01205: no more pending results, returning what we have 30529 1726882700.01208: results queue empty 30529 1726882700.01209: checking for any_errors_fatal 30529 1726882700.01216: done checking for any_errors_fatal 30529 1726882700.01217: checking for max_fail_percentage 30529 1726882700.01219: done checking for max_fail_percentage 30529 1726882700.01219: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.01220: done checking to see if all hosts have failed 30529 1726882700.01221: getting the remaining hosts for this loop 30529 1726882700.01222: done getting the remaining hosts for this loop 30529 1726882700.01225: getting the next task for host managed_node1 30529 1726882700.01232: done getting next task for host managed_node1 30529 1726882700.01236: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882700.01240: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.01261: getting variables 30529 1726882700.01263: in VariableManager get_vars() 30529 1726882700.01300: Calling all_inventory to load vars for managed_node1 30529 1726882700.01302: Calling groups_inventory to load vars for managed_node1 30529 1726882700.01304: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.01312: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.01314: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.01317: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.02049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.02906: done with get_vars() 30529 1726882700.02921: done getting variables 30529 1726882700.02959: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:20 -0400 (0:00:00.029) 0:01:54.055 ****** 30529 1726882700.02982: entering _queue_task() for managed_node1/fail 30529 1726882700.03184: worker is 1 (out of 1 available) 30529 1726882700.03201: exiting _queue_task() for managed_node1/fail 30529 1726882700.03215: done queuing things up, now waiting for results queue to drain 30529 1726882700.03216: waiting for pending results... 30529 1726882700.03383: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882700.03476: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a8 30529 1726882700.03491: variable 'ansible_search_path' from source: unknown 30529 1726882700.03497: variable 'ansible_search_path' from source: unknown 30529 1726882700.03522: calling self._execute() 30529 1726882700.03664: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.03669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.03672: variable 'omit' from source: magic vars 30529 1726882700.03866: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.03880: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.03995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.05705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.05748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.05774: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.05812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.05837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.05890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.05910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.05928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.05956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.05967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.06031: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.06043: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882700.06120: variable 'ansible_distribution' from source: facts 30529 1726882700.06123: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.06130: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882700.06285: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.06305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.06322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.06346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.06356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.06394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.06410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.06427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.06451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882700.06461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.06490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.06512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.06528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.06551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.06561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.06749: variable 'network_connections' from source: include params 30529 1726882700.06757: variable 'interface' from source: play vars 30529 1726882700.06805: variable 'interface' from source: play vars 30529 1726882700.06817: variable 'network_state' from source: role '' defaults 30529 1726882700.06861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.06970: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.07001: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.07024: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.07047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.07077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.07096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.07117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.07136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.07155: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882700.07158: when evaluation is False, skipping this task 30529 1726882700.07161: _execute() done 30529 1726882700.07163: dumping result to json 30529 1726882700.07165: done dumping result, returning 30529 1726882700.07172: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-0000000024a8] 30529 1726882700.07175: sending task result for task 
12673a56-9f93-b0f1-edc0-0000000024a8 30529 1726882700.07260: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a8 30529 1726882700.07263: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882700.07309: no more pending results, returning what we have 30529 1726882700.07312: results queue empty 30529 1726882700.07313: checking for any_errors_fatal 30529 1726882700.07319: done checking for any_errors_fatal 30529 1726882700.07320: checking for max_fail_percentage 30529 1726882700.07321: done checking for max_fail_percentage 30529 1726882700.07322: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.07323: done checking to see if all hosts have failed 30529 1726882700.07324: getting the remaining hosts for this loop 30529 1726882700.07326: done getting the remaining hosts for this loop 30529 1726882700.07329: getting the next task for host managed_node1 30529 1726882700.07337: done getting next task for host managed_node1 30529 1726882700.07341: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882700.07346: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.07372: getting variables 30529 1726882700.07374: in VariableManager get_vars() 30529 1726882700.07421: Calling all_inventory to load vars for managed_node1 30529 1726882700.07424: Calling groups_inventory to load vars for managed_node1 30529 1726882700.07426: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.07435: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.07437: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.07440: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.08341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.09181: done with get_vars() 30529 1726882700.09199: done getting variables 30529 1726882700.09240: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:20 -0400 (0:00:00.062) 0:01:54.118 ****** 30529 1726882700.09263: entering _queue_task() for managed_node1/dnf 30529 1726882700.09481: worker is 1 (out of 1 available) 30529 1726882700.09495: exiting _queue_task() for managed_node1/dnf 30529 1726882700.09509: done queuing things up, now waiting for results queue to drain 30529 1726882700.09511: waiting for pending results... 30529 1726882700.09696: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882700.09799: in run() - task 12673a56-9f93-b0f1-edc0-0000000024a9 30529 1726882700.09812: variable 'ansible_search_path' from source: unknown 30529 1726882700.09816: variable 'ansible_search_path' from source: unknown 30529 1726882700.09844: calling self._execute() 30529 1726882700.09919: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.09923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.09932: variable 'omit' from source: magic vars 30529 1726882700.10202: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.10212: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.10347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.11844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.11899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.11927: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.11952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.11971: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.12033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.12054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.12071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.12101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.12114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.12191: variable 'ansible_distribution' from source: facts 30529 1726882700.12199: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.12213: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882700.12284: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.12369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.12386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.12407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.12433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.12445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.12472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.12487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.12507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.12530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.12540: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.12570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.12585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.12606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.12629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.12639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.12740: variable 'network_connections' from source: include params 30529 1726882700.12750: variable 'interface' from source: play vars 30529 1726882700.12798: variable 'interface' from source: play vars 30529 1726882700.12847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.12956: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.12987: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.13010: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.13031: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.13071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.13089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.13114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.13132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.13166: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882700.13324: variable 'network_connections' from source: include params 30529 1726882700.13327: variable 'interface' from source: play vars 30529 1726882700.13367: variable 'interface' from source: play vars 30529 1726882700.13385: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882700.13388: when evaluation is False, skipping this task 30529 1726882700.13394: _execute() done 30529 1726882700.13397: dumping result to json 30529 1726882700.13399: done dumping result, returning 30529 1726882700.13407: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000024a9] 30529 
1726882700.13411: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a9 30529 1726882700.13496: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024a9 30529 1726882700.13498: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882700.13583: no more pending results, returning what we have 30529 1726882700.13586: results queue empty 30529 1726882700.13587: checking for any_errors_fatal 30529 1726882700.13596: done checking for any_errors_fatal 30529 1726882700.13597: checking for max_fail_percentage 30529 1726882700.13598: done checking for max_fail_percentage 30529 1726882700.13599: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.13600: done checking to see if all hosts have failed 30529 1726882700.13601: getting the remaining hosts for this loop 30529 1726882700.13603: done getting the remaining hosts for this loop 30529 1726882700.13606: getting the next task for host managed_node1 30529 1726882700.13615: done getting next task for host managed_node1 30529 1726882700.13618: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882700.13622: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.13645: getting variables 30529 1726882700.13647: in VariableManager get_vars() 30529 1726882700.13686: Calling all_inventory to load vars for managed_node1 30529 1726882700.13688: Calling groups_inventory to load vars for managed_node1 30529 1726882700.13690: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.13704: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.13707: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.13709: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.14512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.15483: done with get_vars() 30529 1726882700.15503: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882700.15554: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:20 -0400 (0:00:00.063) 0:01:54.181 ****** 30529 1726882700.15577: entering _queue_task() for managed_node1/yum 30529 1726882700.15807: worker is 1 (out of 1 available) 30529 1726882700.15821: exiting _queue_task() for managed_node1/yum 30529 1726882700.15832: done queuing things up, now waiting for results queue to drain 30529 1726882700.15834: waiting for pending results... 30529 1726882700.16020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882700.16102: in run() - task 12673a56-9f93-b0f1-edc0-0000000024aa 30529 1726882700.16114: variable 'ansible_search_path' from source: unknown 30529 1726882700.16117: variable 'ansible_search_path' from source: unknown 30529 1726882700.16144: calling self._execute() 30529 1726882700.16224: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.16228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.16236: variable 'omit' from source: magic vars 30529 1726882700.16513: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.16522: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.16638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.23009: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.23050: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.23075: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.23099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.23129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.23176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.23197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.23215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.23246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.23254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.23319: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.23330: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882700.23333: when evaluation is False, skipping this task 30529 1726882700.23335: _execute() done 30529 1726882700.23338: dumping result to json 30529 1726882700.23340: done dumping result, returning 30529 1726882700.23349: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000024aa] 30529 1726882700.23352: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024aa 30529 1726882700.23437: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024aa 30529 1726882700.23440: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882700.23510: no more pending results, returning what we have 30529 1726882700.23513: results queue empty 30529 1726882700.23514: checking for any_errors_fatal 30529 1726882700.23520: done checking for any_errors_fatal 30529 1726882700.23520: checking for max_fail_percentage 30529 1726882700.23522: done checking for max_fail_percentage 30529 1726882700.23523: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.23524: done checking to see if all hosts have failed 30529 1726882700.23525: getting the remaining hosts for this loop 30529 1726882700.23526: done getting the remaining hosts for this loop 30529 1726882700.23530: getting the next task for host managed_node1 30529 1726882700.23537: done getting next task for host managed_node1 30529 1726882700.23540: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882700.23544: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.23568: getting variables 30529 1726882700.23571: in VariableManager get_vars() 30529 1726882700.23614: Calling all_inventory to load vars for managed_node1 30529 1726882700.23617: Calling groups_inventory to load vars for managed_node1 30529 1726882700.23619: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.23628: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.23630: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.23633: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.29296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.30135: done with get_vars() 30529 1726882700.30152: done getting variables 30529 1726882700.30186: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:20 -0400 (0:00:00.146) 0:01:54.328 ****** 30529 1726882700.30213: entering _queue_task() for managed_node1/fail 30529 1726882700.30492: worker is 1 (out of 1 available) 30529 1726882700.30508: exiting _queue_task() for managed_node1/fail 30529 1726882700.30520: done queuing things up, now waiting for results queue to drain 30529 1726882700.30523: waiting for pending results... 30529 1726882700.30723: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882700.30830: in run() - task 12673a56-9f93-b0f1-edc0-0000000024ab 30529 1726882700.30843: variable 'ansible_search_path' from source: unknown 30529 1726882700.30847: variable 'ansible_search_path' from source: unknown 30529 1726882700.30877: calling self._execute() 30529 1726882700.30959: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.30965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.30975: variable 'omit' from source: magic vars 30529 1726882700.31259: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.31269: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.31361: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.31497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.33015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.33069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.33098: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.33124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.33146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.33206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.33228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.33245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.33274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.33284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.33320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.33337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.33353: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.33380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.33395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.33423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.33439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.33454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.33478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.33498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.33608: variable 'network_connections' from source: include params 30529 1726882700.33619: variable 'interface' from source: play vars 30529 1726882700.33665: variable 'interface' from source: play vars 30529 1726882700.33720: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.33835: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.33863: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.33885: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.33911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.33943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.33958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.33974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.34030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.34033: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882700.34183: variable 'network_connections' from source: include params 30529 1726882700.34186: variable 'interface' from source: play vars 30529 1726882700.34230: variable 'interface' from source: play vars 30529 1726882700.34250: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882700.34254: when evaluation is False, skipping this task 30529 
1726882700.34256: _execute() done 30529 1726882700.34259: dumping result to json 30529 1726882700.34261: done dumping result, returning 30529 1726882700.34267: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000024ab] 30529 1726882700.34272: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ab 30529 1726882700.34364: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ab 30529 1726882700.34366: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882700.34419: no more pending results, returning what we have 30529 1726882700.34423: results queue empty 30529 1726882700.34424: checking for any_errors_fatal 30529 1726882700.34433: done checking for any_errors_fatal 30529 1726882700.34433: checking for max_fail_percentage 30529 1726882700.34435: done checking for max_fail_percentage 30529 1726882700.34436: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.34437: done checking to see if all hosts have failed 30529 1726882700.34437: getting the remaining hosts for this loop 30529 1726882700.34439: done getting the remaining hosts for this loop 30529 1726882700.34442: getting the next task for host managed_node1 30529 1726882700.34451: done getting next task for host managed_node1 30529 1726882700.34455: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882700.34460: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.34486: getting variables 30529 1726882700.34490: in VariableManager get_vars() 30529 1726882700.34534: Calling all_inventory to load vars for managed_node1 30529 1726882700.34536: Calling groups_inventory to load vars for managed_node1 30529 1726882700.34539: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.34548: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.34550: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.34553: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.35376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.36348: done with get_vars() 30529 1726882700.36363: done getting variables 30529 1726882700.36407: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:20 -0400 (0:00:00.062) 0:01:54.390 ****** 30529 1726882700.36434: entering _queue_task() for managed_node1/package 30529 1726882700.36660: worker is 1 (out of 1 available) 30529 1726882700.36673: exiting _queue_task() for managed_node1/package 30529 1726882700.36686: done queuing things up, now waiting for results queue to drain 30529 1726882700.36687: waiting for pending results... 30529 1726882700.36883: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882700.36988: in run() - task 12673a56-9f93-b0f1-edc0-0000000024ac 30529 1726882700.37002: variable 'ansible_search_path' from source: unknown 30529 1726882700.37005: variable 'ansible_search_path' from source: unknown 30529 1726882700.37035: calling self._execute() 30529 1726882700.37115: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.37119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.37131: variable 'omit' from source: magic vars 30529 1726882700.37407: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.37416: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.37546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.37736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.37768: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.37797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.37851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.37933: variable 'network_packages' from source: role '' defaults 30529 1726882700.38007: variable '__network_provider_setup' from source: role '' defaults 30529 1726882700.38016: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882700.38060: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882700.38068: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882700.38115: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882700.38231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.39562: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.39608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.39637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.39661: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.39680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.39752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.39771: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.39788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.39818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.39829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.39862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.39878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.39899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.39924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.39934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882700.40081: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882700.40154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.40173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.40189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.40218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.40228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.40290: variable 'ansible_python' from source: facts 30529 1726882700.40306: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882700.40360: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882700.40420: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882700.40502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.40518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.40535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.40558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.40568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.40604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.40623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.40639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.40662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.40673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.40770: variable 'network_connections' from source: include params 
30529 1726882700.40774: variable 'interface' from source: play vars 30529 1726882700.40849: variable 'interface' from source: play vars 30529 1726882700.40901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.40921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.40943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.40964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.41005: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.41181: variable 'network_connections' from source: include params 30529 1726882700.41184: variable 'interface' from source: play vars 30529 1726882700.41259: variable 'interface' from source: play vars 30529 1726882700.41281: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882700.41336: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.41530: variable 'network_connections' from source: include params 30529 1726882700.41533: variable 'interface' from source: play vars 30529 1726882700.41578: variable 'interface' from source: play vars 30529 1726882700.41598: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882700.41650: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882700.41841: variable 'network_connections' 
from source: include params 30529 1726882700.41844: variable 'interface' from source: play vars 30529 1726882700.41887: variable 'interface' from source: play vars 30529 1726882700.41930: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882700.41971: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882700.41976: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882700.42022: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882700.42155: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882700.42443: variable 'network_connections' from source: include params 30529 1726882700.42448: variable 'interface' from source: play vars 30529 1726882700.42489: variable 'interface' from source: play vars 30529 1726882700.42499: variable 'ansible_distribution' from source: facts 30529 1726882700.42502: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.42507: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.42518: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882700.42623: variable 'ansible_distribution' from source: facts 30529 1726882700.42626: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.42630: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.42641: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882700.42745: variable 'ansible_distribution' from source: facts 30529 1726882700.42749: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.42753: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.42779: variable 'network_provider' from source: set_fact 30529 
1726882700.42792: variable 'ansible_facts' from source: unknown 30529 1726882700.43234: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882700.43238: when evaluation is False, skipping this task 30529 1726882700.43240: _execute() done 30529 1726882700.43243: dumping result to json 30529 1726882700.43245: done dumping result, returning 30529 1726882700.43253: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-0000000024ac] 30529 1726882700.43255: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ac 30529 1726882700.43348: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ac 30529 1726882700.43351: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882700.43401: no more pending results, returning what we have 30529 1726882700.43404: results queue empty 30529 1726882700.43405: checking for any_errors_fatal 30529 1726882700.43411: done checking for any_errors_fatal 30529 1726882700.43411: checking for max_fail_percentage 30529 1726882700.43413: done checking for max_fail_percentage 30529 1726882700.43414: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.43415: done checking to see if all hosts have failed 30529 1726882700.43415: getting the remaining hosts for this loop 30529 1726882700.43417: done getting the remaining hosts for this loop 30529 1726882700.43421: getting the next task for host managed_node1 30529 1726882700.43429: done getting next task for host managed_node1 30529 1726882700.43433: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882700.43437: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882700.43463: getting variables 30529 1726882700.43464: in VariableManager get_vars() 30529 1726882700.43514: Calling all_inventory to load vars for managed_node1 30529 1726882700.43517: Calling groups_inventory to load vars for managed_node1 30529 1726882700.43519: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.43528: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.43531: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.43533: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.44336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.45197: done with get_vars() 30529 1726882700.45215: done getting variables 30529 1726882700.45257: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:20 -0400 (0:00:00.088) 0:01:54.478 ****** 30529 1726882700.45282: entering _queue_task() for managed_node1/package 30529 1726882700.45525: worker is 1 (out of 1 available) 30529 1726882700.45538: exiting _queue_task() for managed_node1/package 30529 1726882700.45552: done queuing things up, now waiting for results queue to drain 30529 1726882700.45553: waiting for pending results... 
30529 1726882700.45756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882700.45857: in run() - task 12673a56-9f93-b0f1-edc0-0000000024ad 30529 1726882700.45869: variable 'ansible_search_path' from source: unknown 30529 1726882700.45872: variable 'ansible_search_path' from source: unknown 30529 1726882700.45905: calling self._execute() 30529 1726882700.45981: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.45984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.45998: variable 'omit' from source: magic vars 30529 1726882700.46275: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.46286: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.46375: variable 'network_state' from source: role '' defaults 30529 1726882700.46384: Evaluated conditional (network_state != {}): False 30529 1726882700.46390: when evaluation is False, skipping this task 30529 1726882700.46395: _execute() done 30529 1726882700.46398: dumping result to json 30529 1726882700.46400: done dumping result, returning 30529 1726882700.46403: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000024ad] 30529 1726882700.46409: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ad 30529 1726882700.46506: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ad 30529 1726882700.46509: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882700.46577: no more pending results, returning what we have 30529 1726882700.46581: results queue empty 30529 1726882700.46582: checking 
for any_errors_fatal 30529 1726882700.46586: done checking for any_errors_fatal 30529 1726882700.46590: checking for max_fail_percentage 30529 1726882700.46591: done checking for max_fail_percentage 30529 1726882700.46592: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.46595: done checking to see if all hosts have failed 30529 1726882700.46595: getting the remaining hosts for this loop 30529 1726882700.46597: done getting the remaining hosts for this loop 30529 1726882700.46600: getting the next task for host managed_node1 30529 1726882700.46608: done getting next task for host managed_node1 30529 1726882700.46611: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882700.46615: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882700.46639: getting variables 30529 1726882700.46641: in VariableManager get_vars() 30529 1726882700.46676: Calling all_inventory to load vars for managed_node1 30529 1726882700.46678: Calling groups_inventory to load vars for managed_node1 30529 1726882700.46680: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.46691: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.46699: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.46703: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.47601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.48445: done with get_vars() 30529 1726882700.48461: done getting variables 30529 1726882700.48508: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:20 -0400 (0:00:00.032) 0:01:54.511 ****** 30529 1726882700.48533: entering _queue_task() for managed_node1/package 30529 1726882700.48776: worker is 1 (out of 1 available) 30529 1726882700.48791: exiting _queue_task() for managed_node1/package 30529 1726882700.48805: done queuing things up, now waiting for results queue to drain 30529 1726882700.48807: waiting for pending results... 
30529 1726882700.48990: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882700.49095: in run() - task 12673a56-9f93-b0f1-edc0-0000000024ae 30529 1726882700.49105: variable 'ansible_search_path' from source: unknown 30529 1726882700.49109: variable 'ansible_search_path' from source: unknown 30529 1726882700.49136: calling self._execute() 30529 1726882700.49216: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.49221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.49228: variable 'omit' from source: magic vars 30529 1726882700.49503: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.49512: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.49698: variable 'network_state' from source: role '' defaults 30529 1726882700.49701: Evaluated conditional (network_state != {}): False 30529 1726882700.49703: when evaluation is False, skipping this task 30529 1726882700.49705: _execute() done 30529 1726882700.49707: dumping result to json 30529 1726882700.49708: done dumping result, returning 30529 1726882700.49711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-0000000024ae] 30529 1726882700.49712: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ae 30529 1726882700.49781: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024ae 30529 1726882700.49784: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882700.49836: no more pending results, returning what we have 30529 1726882700.49839: results queue empty 30529 1726882700.49840: checking for 
any_errors_fatal 30529 1726882700.49846: done checking for any_errors_fatal 30529 1726882700.49847: checking for max_fail_percentage 30529 1726882700.49848: done checking for max_fail_percentage 30529 1726882700.49849: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.49849: done checking to see if all hosts have failed 30529 1726882700.49850: getting the remaining hosts for this loop 30529 1726882700.49851: done getting the remaining hosts for this loop 30529 1726882700.49855: getting the next task for host managed_node1 30529 1726882700.49862: done getting next task for host managed_node1 30529 1726882700.49864: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882700.49869: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882700.49891: getting variables 30529 1726882700.49894: in VariableManager get_vars() 30529 1726882700.49925: Calling all_inventory to load vars for managed_node1 30529 1726882700.49927: Calling groups_inventory to load vars for managed_node1 30529 1726882700.49928: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.49935: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.49937: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.49938: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.50684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.51550: done with get_vars() 30529 1726882700.51564: done getting variables 30529 1726882700.51609: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:20 -0400 (0:00:00.031) 0:01:54.542 ****** 30529 1726882700.51637: entering _queue_task() for managed_node1/service 30529 1726882700.51863: worker is 1 (out of 1 available) 30529 1726882700.51877: exiting _queue_task() for managed_node1/service 30529 1726882700.51892: done queuing things up, now waiting for results queue to drain 30529 1726882700.51895: waiting for pending results... 
30529 1726882700.52069: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882700.52171: in run() - task 12673a56-9f93-b0f1-edc0-0000000024af 30529 1726882700.52183: variable 'ansible_search_path' from source: unknown 30529 1726882700.52186: variable 'ansible_search_path' from source: unknown 30529 1726882700.52217: calling self._execute() 30529 1726882700.52296: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.52299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.52308: variable 'omit' from source: magic vars 30529 1726882700.52573: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.52582: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.52670: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.52798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.54515: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.54556: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.54595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.54621: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.54642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.54698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882700.54718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.54737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.54764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.54775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.54810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.54828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.54844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.54882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.54912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.54928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.54944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.54970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.54981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.55097: variable 'network_connections' from source: include params 30529 1726882700.55106: variable 'interface' from source: play vars 30529 1726882700.55152: variable 'interface' from source: play vars 30529 1726882700.55205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.55312: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.55339: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.55362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.55400: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.55426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.55441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.55458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.55474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.55516: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882700.55666: variable 'network_connections' from source: include params 30529 1726882700.55669: variable 'interface' from source: play vars 30529 1726882700.55714: variable 'interface' from source: play vars 30529 1726882700.55734: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882700.55738: when evaluation is False, skipping this task 30529 1726882700.55740: _execute() done 30529 1726882700.55743: dumping result to json 30529 1726882700.55745: done dumping result, returning 30529 1726882700.55752: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-0000000024af] 30529 1726882700.55756: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024af 30529 1726882700.55848: done sending task result for task 
12673a56-9f93-b0f1-edc0-0000000024af 30529 1726882700.55858: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882700.55909: no more pending results, returning what we have 30529 1726882700.55913: results queue empty 30529 1726882700.55914: checking for any_errors_fatal 30529 1726882700.55919: done checking for any_errors_fatal 30529 1726882700.55920: checking for max_fail_percentage 30529 1726882700.55921: done checking for max_fail_percentage 30529 1726882700.55922: checking to see if all hosts have failed and the running result is not ok 30529 1726882700.55923: done checking to see if all hosts have failed 30529 1726882700.55924: getting the remaining hosts for this loop 30529 1726882700.55926: done getting the remaining hosts for this loop 30529 1726882700.55930: getting the next task for host managed_node1 30529 1726882700.55938: done getting next task for host managed_node1 30529 1726882700.55942: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882700.55946: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882700.55975: getting variables 30529 1726882700.55977: in VariableManager get_vars() 30529 1726882700.56025: Calling all_inventory to load vars for managed_node1 30529 1726882700.56027: Calling groups_inventory to load vars for managed_node1 30529 1726882700.56029: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882700.56039: Calling all_plugins_play to load vars for managed_node1 30529 1726882700.56042: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882700.56044: Calling groups_plugins_play to load vars for managed_node1 30529 1726882700.56959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882700.57816: done with get_vars() 30529 1726882700.57832: done getting variables 30529 1726882700.57872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:20 -0400 (0:00:00.062) 0:01:54.605 ****** 30529 1726882700.57899: entering _queue_task() for managed_node1/service 30529 1726882700.58132: worker is 1 (out of 1 available) 30529 1726882700.58146: exiting _queue_task() for managed_node1/service 30529 1726882700.58159: done 
queuing things up, now waiting for results queue to drain 30529 1726882700.58161: waiting for pending results... 30529 1726882700.58341: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882700.58448: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b0 30529 1726882700.58459: variable 'ansible_search_path' from source: unknown 30529 1726882700.58462: variable 'ansible_search_path' from source: unknown 30529 1726882700.58498: calling self._execute() 30529 1726882700.58564: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.58569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.58577: variable 'omit' from source: magic vars 30529 1726882700.58849: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.58859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882700.58973: variable 'network_provider' from source: set_fact 30529 1726882700.58977: variable 'network_state' from source: role '' defaults 30529 1726882700.58985: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882700.58992: variable 'omit' from source: magic vars 30529 1726882700.59031: variable 'omit' from source: magic vars 30529 1726882700.59053: variable 'network_service_name' from source: role '' defaults 30529 1726882700.59100: variable 'network_service_name' from source: role '' defaults 30529 1726882700.59169: variable '__network_provider_setup' from source: role '' defaults 30529 1726882700.59174: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882700.59220: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882700.59228: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882700.59271: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882700.59425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882700.60886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882700.60943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882700.60969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882700.60998: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882700.61022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882700.61077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.61101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.61121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.61147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.61158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.61189: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.61209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.61230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.61254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.61266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.61415: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882700.61486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.61507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.61524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.61551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.61559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.61624: variable 'ansible_python' from source: facts 30529 1726882700.61636: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882700.61691: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882700.61745: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882700.61829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.61847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.61863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.61890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.61905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.61937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882700.61955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882700.61971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.62003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882700.62013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882700.62105: variable 'network_connections' from source: include params 30529 1726882700.62111: variable 'interface' from source: play vars 30529 1726882700.62162: variable 'interface' from source: play vars 30529 1726882700.62236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882700.62363: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882700.62401: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882700.62435: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882700.62464: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882700.62509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882700.62532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882700.62554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882700.62576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882700.62616: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.62788: variable 'network_connections' from source: include params 30529 1726882700.62798: variable 'interface' from source: play vars 30529 1726882700.62849: variable 'interface' from source: play vars 30529 1726882700.62872: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882700.62928: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882700.63112: variable 'network_connections' from source: include params 30529 1726882700.63116: variable 'interface' from source: play vars 30529 1726882700.63163: variable 'interface' from source: play vars 30529 1726882700.63181: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882700.63238: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882700.63422: variable 'network_connections' from source: include params 30529 1726882700.63425: variable 'interface' from source: play vars 30529 1726882700.63472: variable 'interface' from source: play vars 30529 1726882700.63517: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882700.63554: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882700.63560: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882700.63604: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882700.63738: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882700.64048: variable 'network_connections' from source: include params 30529 1726882700.64053: variable 'interface' from source: play vars 30529 1726882700.64097: variable 'interface' from source: play vars 30529 1726882700.64104: variable 'ansible_distribution' from source: facts 30529 1726882700.64107: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.64113: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.64124: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882700.64236: variable 'ansible_distribution' from source: facts 30529 1726882700.64240: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.64244: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.64255: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882700.64367: variable 'ansible_distribution' from source: facts 30529 1726882700.64371: variable '__network_rh_distros' from source: role '' defaults 30529 1726882700.64375: variable 'ansible_distribution_major_version' from source: facts 30529 1726882700.64407: variable 'network_provider' from source: set_fact 30529 1726882700.64425: variable 'omit' from source: magic vars 30529 1726882700.64447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882700.64468: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882700.64483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882700.64502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882700.64511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882700.64533: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882700.64536: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.64539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882700.64611: Set connection var ansible_shell_executable to /bin/sh 30529 1726882700.64616: Set connection var ansible_pipelining to False 30529 1726882700.64618: Set connection var ansible_shell_type to sh 30529 1726882700.64626: Set connection var ansible_timeout to 10 30529 1726882700.64628: Set connection var ansible_connection to ssh 30529 1726882700.64633: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882700.64651: variable 'ansible_shell_executable' from source: unknown 30529 1726882700.64654: variable 'ansible_connection' from source: unknown 30529 1726882700.64656: variable 'ansible_module_compression' from source: unknown 30529 1726882700.64658: variable 'ansible_shell_type' from source: unknown 30529 1726882700.64660: variable 'ansible_shell_executable' from source: unknown 30529 1726882700.64662: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882700.64667: variable 'ansible_pipelining' from source: unknown 30529 1726882700.64670: variable 'ansible_timeout' from source: unknown 30529 1726882700.64674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882700.64750: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882700.64758: variable 'omit' from source: magic vars 30529 1726882700.64763: starting attempt loop 30529 1726882700.64766: running the handler 30529 1726882700.64825: variable 'ansible_facts' from source: unknown 30529 1726882700.65413: _low_level_execute_command(): starting 30529 1726882700.65420: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882700.65923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.65937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882700.65951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.65976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882700.65979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
30529 1726882700.65997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882700.66046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882700.67736: stdout chunk (state=3): >>>/root <<< 30529 1726882700.67852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882700.67856: stdout chunk (state=3): >>><<< 30529 1726882700.67864: stderr chunk (state=3): >>><<< 30529 1726882700.67879: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882700.67888: _low_level_execute_command(): starting 30529 1726882700.67897: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406 `" && echo ansible-tmp-1726882700.6787827-35700-3259474776406="` echo /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406 `" ) && sleep 0' 30529 1726882700.68321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882700.68324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.68327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.68329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.68378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882700.68385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882700.68389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882700.68429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882700.70282: stdout chunk (state=3): >>>ansible-tmp-1726882700.6787827-35700-3259474776406=/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406 <<< 30529 1726882700.70389: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882700.70411: stderr chunk (state=3): >>><<< 30529 1726882700.70415: stdout chunk (state=3): >>><<< 30529 1726882700.70428: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882700.6787827-35700-3259474776406=/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882700.70451: variable 'ansible_module_compression' from source: unknown 30529 1726882700.70489: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882700.70539: variable 'ansible_facts' from source: unknown 30529 1726882700.70673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py 
30529 1726882700.70768: Sending initial data 30529 1726882700.70771: Sent initial data (154 bytes) 30529 1726882700.71202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.71205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882700.71211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.71213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.71216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882700.71218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.71260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882700.71263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882700.71312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882700.72816: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 
2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882700.72821: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882700.72854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882700.72900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpdytzhv5v /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py <<< 30529 1726882700.72905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py" <<< 30529 1726882700.72937: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpdytzhv5v" to remote "/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py" <<< 30529 1726882700.73967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882700.74004: stderr chunk (state=3): >>><<< 30529 1726882700.74007: stdout chunk (state=3): >>><<< 30529 1726882700.74091: done transferring module to remote 30529 1726882700.74097: _low_level_execute_command(): starting 30529 1726882700.74100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/ 
/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py && sleep 0' 30529 1726882700.74459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.74462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882700.74464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882700.74466: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.74468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.74520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882700.74523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882700.74568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882700.76257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882700.76277: stderr chunk (state=3): >>><<< 30529 1726882700.76281: stdout chunk (state=3): >>><<< 30529 1726882700.76299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882700.76302: _low_level_execute_command(): starting 30529 1726882700.76305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/AnsiballZ_systemd.py && sleep 0' 30529 1726882700.76705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.76708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882700.76710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.76712: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882700.76715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882700.76717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882700.76762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882700.76765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882700.76817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.05568: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": 
"[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10817536", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300229120", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1961614000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882701.05576: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882701.05599: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882701.07345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882701.07374: stderr chunk (state=3): >>><<< 30529 1726882701.07377: stdout chunk (state=3): >>><<< 30529 1726882701.07397: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10817536", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300229120", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1961614000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882701.07516: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882701.07534: _low_level_execute_command(): starting 30529 1726882701.07537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882700.6787827-35700-3259474776406/ > /dev/null 2>&1 && sleep 0' 30529 1726882701.07990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.07996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.07998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882701.08000: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.08002: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.08052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.08056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.08060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.08104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.09879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.09905: stderr chunk (state=3): >>><<< 30529 1726882701.09909: stdout chunk (state=3): >>><<< 30529 1726882701.09923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.09929: handler run complete 30529 1726882701.09965: attempt loop complete, returning result 30529 1726882701.09968: _execute() done 30529 1726882701.09970: dumping result to json 30529 1726882701.09981: done dumping result, returning 30529 1726882701.09994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-0000000024b0] 30529 1726882701.09997: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b0 30529 1726882701.10232: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b0 30529 1726882701.10234: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882701.10297: no more pending results, returning what we have 30529 1726882701.10301: results queue empty 30529 1726882701.10302: checking for any_errors_fatal 30529 1726882701.10307: done checking for any_errors_fatal 30529 1726882701.10308: checking for max_fail_percentage 30529 1726882701.10309: done checking for max_fail_percentage 30529 1726882701.10310: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.10311: done checking to see if all hosts have failed 30529 1726882701.10312: getting the remaining hosts for this loop 30529 1726882701.10313: done getting the remaining hosts for this loop 30529 1726882701.10316: getting the next task for host managed_node1 30529 1726882701.10324: done getting next task for host managed_node1 30529 1726882701.10327: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882701.10332: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.10346: getting variables 30529 1726882701.10348: in VariableManager get_vars() 30529 1726882701.10383: Calling all_inventory to load vars for managed_node1 30529 1726882701.10385: Calling groups_inventory to load vars for managed_node1 30529 1726882701.10387: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.10401: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.10404: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.10407: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.11301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.12156: done with get_vars() 30529 1726882701.12173: done getting variables 30529 1726882701.12219: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:21 -0400 (0:00:00.543) 0:01:55.148 ****** 30529 1726882701.12248: entering _queue_task() for managed_node1/service 30529 1726882701.12499: worker is 1 (out of 1 available) 30529 1726882701.12512: exiting _queue_task() for managed_node1/service 30529 1726882701.12524: done queuing things up, now waiting for results queue to drain 30529 1726882701.12526: waiting for pending results... 
30529 1726882701.12717: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882701.12813: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b1 30529 1726882701.12826: variable 'ansible_search_path' from source: unknown 30529 1726882701.12830: variable 'ansible_search_path' from source: unknown 30529 1726882701.12861: calling self._execute() 30529 1726882701.12938: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.12942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.12950: variable 'omit' from source: magic vars 30529 1726882701.13230: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.13240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.13325: variable 'network_provider' from source: set_fact 30529 1726882701.13329: Evaluated conditional (network_provider == "nm"): True 30529 1726882701.13390: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882701.13456: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882701.13573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882701.15019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882701.15064: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882701.15091: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882701.15122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882701.15143: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882701.15216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882701.15237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882701.15255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882701.15285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882701.15300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882701.15332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882701.15349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882701.15367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882701.15397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882701.15407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882701.15434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882701.15450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882701.15467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882701.15497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882701.15509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882701.15605: variable 'network_connections' from source: include params 30529 1726882701.15616: variable 'interface' from source: play vars 30529 1726882701.15663: variable 'interface' from source: play vars 30529 1726882701.15718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882701.15827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882701.15854: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882701.15878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882701.15905: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882701.15936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882701.15951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882701.15968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882701.15984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882701.16028: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882701.16182: variable 'network_connections' from source: include params 30529 1726882701.16186: variable 'interface' from source: play vars 30529 1726882701.16233: variable 'interface' from source: play vars 30529 1726882701.16257: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882701.16260: when evaluation is False, skipping this task 30529 1726882701.16263: _execute() done 30529 1726882701.16265: dumping result to json 30529 1726882701.16267: done dumping result, returning 30529 1726882701.16274: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-0000000024b1] 30529 
1726882701.16284: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b1 30529 1726882701.16369: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b1 30529 1726882701.16372: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882701.16421: no more pending results, returning what we have 30529 1726882701.16425: results queue empty 30529 1726882701.16426: checking for any_errors_fatal 30529 1726882701.16447: done checking for any_errors_fatal 30529 1726882701.16448: checking for max_fail_percentage 30529 1726882701.16450: done checking for max_fail_percentage 30529 1726882701.16451: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.16452: done checking to see if all hosts have failed 30529 1726882701.16452: getting the remaining hosts for this loop 30529 1726882701.16454: done getting the remaining hosts for this loop 30529 1726882701.16458: getting the next task for host managed_node1 30529 1726882701.16467: done getting next task for host managed_node1 30529 1726882701.16471: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882701.16475: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882701.16510: getting variables 30529 1726882701.16513: in VariableManager get_vars() 30529 1726882701.16554: Calling all_inventory to load vars for managed_node1 30529 1726882701.16556: Calling groups_inventory to load vars for managed_node1 30529 1726882701.16558: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.16568: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.16570: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.16572: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.17356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.18229: done with get_vars() 30529 1726882701.18247: done getting variables 30529 1726882701.18290: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:21 -0400 (0:00:00.060) 0:01:55.209 
****** 30529 1726882701.18316: entering _queue_task() for managed_node1/service 30529 1726882701.18557: worker is 1 (out of 1 available) 30529 1726882701.18569: exiting _queue_task() for managed_node1/service 30529 1726882701.18583: done queuing things up, now waiting for results queue to drain 30529 1726882701.18585: waiting for pending results... 30529 1726882701.18765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882701.18870: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b2 30529 1726882701.18881: variable 'ansible_search_path' from source: unknown 30529 1726882701.18885: variable 'ansible_search_path' from source: unknown 30529 1726882701.18921: calling self._execute() 30529 1726882701.18989: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.18997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.19006: variable 'omit' from source: magic vars 30529 1726882701.19273: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.19282: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.19365: variable 'network_provider' from source: set_fact 30529 1726882701.19368: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882701.19371: when evaluation is False, skipping this task 30529 1726882701.19375: _execute() done 30529 1726882701.19377: dumping result to json 30529 1726882701.19382: done dumping result, returning 30529 1726882701.19388: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-0000000024b2] 30529 1726882701.19397: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b2 30529 1726882701.19480: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b2 30529 1726882701.19483: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882701.19529: no more pending results, returning what we have 30529 1726882701.19533: results queue empty 30529 1726882701.19534: checking for any_errors_fatal 30529 1726882701.19543: done checking for any_errors_fatal 30529 1726882701.19544: checking for max_fail_percentage 30529 1726882701.19546: done checking for max_fail_percentage 30529 1726882701.19547: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.19548: done checking to see if all hosts have failed 30529 1726882701.19548: getting the remaining hosts for this loop 30529 1726882701.19550: done getting the remaining hosts for this loop 30529 1726882701.19553: getting the next task for host managed_node1 30529 1726882701.19561: done getting next task for host managed_node1 30529 1726882701.19564: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882701.19568: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882701.19593: getting variables 30529 1726882701.19595: in VariableManager get_vars() 30529 1726882701.19633: Calling all_inventory to load vars for managed_node1 30529 1726882701.19635: Calling groups_inventory to load vars for managed_node1 30529 1726882701.19638: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.19646: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.19649: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.19651: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.20531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.21376: done with get_vars() 30529 1726882701.21395: done getting variables 30529 1726882701.21436: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:21 -0400 (0:00:00.031) 0:01:55.240 ****** 30529 1726882701.21460: entering _queue_task() for managed_node1/copy 30529 1726882701.21663: worker is 1 (out of 1 available) 30529 1726882701.21676: exiting _queue_task() for managed_node1/copy 30529 1726882701.21691: done queuing things up, now waiting for results queue to drain 30529 1726882701.21694: waiting for 
pending results... 30529 1726882701.21872: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882701.21968: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b3 30529 1726882701.21980: variable 'ansible_search_path' from source: unknown 30529 1726882701.21985: variable 'ansible_search_path' from source: unknown 30529 1726882701.22013: calling self._execute() 30529 1726882701.22086: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.22095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.22101: variable 'omit' from source: magic vars 30529 1726882701.22364: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.22373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.22454: variable 'network_provider' from source: set_fact 30529 1726882701.22460: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882701.22463: when evaluation is False, skipping this task 30529 1726882701.22466: _execute() done 30529 1726882701.22468: dumping result to json 30529 1726882701.22471: done dumping result, returning 30529 1726882701.22480: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-0000000024b3] 30529 1726882701.22483: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b3 30529 1726882701.22571: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b3 30529 1726882701.22574: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882701.22630: no more pending results, returning what we have 30529 1726882701.22634: results queue empty 30529 
1726882701.22635: checking for any_errors_fatal 30529 1726882701.22639: done checking for any_errors_fatal 30529 1726882701.22640: checking for max_fail_percentage 30529 1726882701.22641: done checking for max_fail_percentage 30529 1726882701.22642: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.22643: done checking to see if all hosts have failed 30529 1726882701.22644: getting the remaining hosts for this loop 30529 1726882701.22645: done getting the remaining hosts for this loop 30529 1726882701.22649: getting the next task for host managed_node1 30529 1726882701.22656: done getting next task for host managed_node1 30529 1726882701.22658: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882701.22663: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.22685: getting variables 30529 1726882701.22686: in VariableManager get_vars() 30529 1726882701.22725: Calling all_inventory to load vars for managed_node1 30529 1726882701.22728: Calling groups_inventory to load vars for managed_node1 30529 1726882701.22730: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.22737: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.22740: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.22742: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.23477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.24344: done with get_vars() 30529 1726882701.24359: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:21 -0400 (0:00:00.029) 0:01:55.270 ****** 30529 1726882701.24422: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882701.24636: worker is 1 (out of 1 available) 30529 1726882701.24649: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882701.24664: done queuing things up, now waiting for results queue to drain 30529 1726882701.24665: waiting for pending results... 
30529 1726882701.24842: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882701.24920: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b4 30529 1726882701.24933: variable 'ansible_search_path' from source: unknown 30529 1726882701.24937: variable 'ansible_search_path' from source: unknown 30529 1726882701.24962: calling self._execute() 30529 1726882701.25040: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.25045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.25053: variable 'omit' from source: magic vars 30529 1726882701.25328: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.25335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.25340: variable 'omit' from source: magic vars 30529 1726882701.25390: variable 'omit' from source: magic vars 30529 1726882701.25497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882701.27165: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882701.27215: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882701.27242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882701.27266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882701.27290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882701.27349: variable 'network_provider' from source: set_fact 30529 1726882701.27443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882701.27463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882701.27481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882701.27513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882701.27524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882701.27577: variable 'omit' from source: magic vars 30529 1726882701.27652: variable 'omit' from source: magic vars 30529 1726882701.27727: variable 'network_connections' from source: include params 30529 1726882701.27733: variable 'interface' from source: play vars 30529 1726882701.27777: variable 'interface' from source: play vars 30529 1726882701.27878: variable 'omit' from source: magic vars 30529 1726882701.27885: variable '__lsr_ansible_managed' from source: task vars 30529 1726882701.27928: variable '__lsr_ansible_managed' from source: task vars 30529 1726882701.28062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882701.28202: Loaded config def from plugin (lookup/template) 30529 1726882701.28206: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882701.28227: File lookup term: get_ansible_managed.j2 30529 1726882701.28229: variable 
'ansible_search_path' from source: unknown 30529 1726882701.28233: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882701.28244: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882701.28257: variable 'ansible_search_path' from source: unknown 30529 1726882701.31520: variable 'ansible_managed' from source: unknown 30529 1726882701.31599: variable 'omit' from source: magic vars 30529 1726882701.31626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882701.31645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882701.31660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882701.31673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882701.31681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.31708: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882701.31711: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.31713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.31780: Set connection var ansible_shell_executable to /bin/sh 30529 1726882701.31783: Set connection var ansible_pipelining to False 30529 1726882701.31786: Set connection var ansible_shell_type to sh 30529 1726882701.31799: Set connection var ansible_timeout to 10 30529 1726882701.31801: Set connection var ansible_connection to ssh 30529 1726882701.31803: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882701.31820: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.31823: variable 'ansible_connection' from source: unknown 30529 1726882701.31825: variable 'ansible_module_compression' from source: unknown 30529 1726882701.31827: variable 'ansible_shell_type' from source: unknown 30529 1726882701.31830: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.31832: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.31837: variable 'ansible_pipelining' from source: unknown 30529 1726882701.31839: variable 'ansible_timeout' from source: unknown 30529 1726882701.31841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.31933: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882701.31945: variable 'omit' from 
source: magic vars 30529 1726882701.31948: starting attempt loop 30529 1726882701.31953: running the handler 30529 1726882701.31966: _low_level_execute_command(): starting 30529 1726882701.31972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882701.32468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.32472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.32474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.32476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.32533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.32537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.32539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.32592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.34183: stdout chunk (state=3): >>>/root <<< 30529 1726882701.34280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 
1726882701.34311: stderr chunk (state=3): >>><<< 30529 1726882701.34316: stdout chunk (state=3): >>><<< 30529 1726882701.34335: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.34345: _low_level_execute_command(): starting 30529 1726882701.34350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096 `" && echo ansible-tmp-1726882701.3433478-35714-43359612823096="` echo /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096 `" ) && sleep 0' 30529 1726882701.34778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30529 1726882701.34781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.34783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882701.34786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882701.34790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.34841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.34848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.34850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.34889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.36745: stdout chunk (state=3): >>>ansible-tmp-1726882701.3433478-35714-43359612823096=/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096 <<< 30529 1726882701.36847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.36874: stderr chunk (state=3): >>><<< 30529 1726882701.36877: stdout chunk (state=3): >>><<< 30529 1726882701.36895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882701.3433478-35714-43359612823096=/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.36933: variable 'ansible_module_compression' from source: unknown 30529 1726882701.36968: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882701.36995: variable 'ansible_facts' from source: unknown 30529 1726882701.37059: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py 30529 1726882701.37155: Sending initial data 30529 1726882701.37159: Sent initial data (167 bytes) 30529 1726882701.37604: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30529 1726882701.37608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882701.37614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.37617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.37619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.37664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.37671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.37673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.37712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.39240: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882701.39243: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882701.39274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882701.39316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpxca1wzq3 /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py <<< 30529 1726882701.39319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py" <<< 30529 1726882701.39360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpxca1wzq3" to remote "/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py" <<< 30529 1726882701.40063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.40105: stderr chunk (state=3): >>><<< 30529 1726882701.40109: stdout chunk (state=3): >>><<< 30529 1726882701.40147: done transferring module to remote 30529 1726882701.40156: _low_level_execute_command(): starting 30529 1726882701.40161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/ /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py && sleep 0' 30529 1726882701.40596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.40599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882701.40601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882701.40603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.40605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.40653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.40657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.40706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.42412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.42439: stderr chunk (state=3): >>><<< 30529 1726882701.42443: stdout chunk (state=3): >>><<< 30529 1726882701.42455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.42459: _low_level_execute_command(): starting 30529 1726882701.42463: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/AnsiballZ_network_connections.py && sleep 0' 30529 1726882701.42891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882701.42897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.42912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.42965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.42977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.42979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.43018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.74886: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_andt9i_e/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_andt9i_e/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/00b37fa6-807a-4f96-b822-2aecde64bf67: error=unknown <<< 30529 1726882701.75207: stdout chunk (state=3): >>> <<< 30529 1726882701.75214: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": 
"nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882701.76928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882701.76960: stderr chunk (state=3): >>><<< 30529 1726882701.76963: stdout chunk (state=3): >>><<< 30529 1726882701.76979: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_andt9i_e/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_andt9i_e/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/00b37fa6-807a-4f96-b822-2aecde64bf67: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882701.77011: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882701.77019: _low_level_execute_command(): starting 30529 1726882701.77024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882701.3433478-35714-43359612823096/ > /dev/null 2>&1 && sleep 0' 30529 1726882701.77476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882701.77480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882701.77482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.77484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882701.77486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.77539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.77542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.77585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.79372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.79398: stderr chunk (state=3): >>><<< 30529 1726882701.79401: stdout chunk (state=3): >>><<< 30529 1726882701.79414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.79421: handler run complete 30529 1726882701.79437: attempt loop complete, returning result 30529 1726882701.79440: _execute() done 30529 1726882701.79442: dumping result to json 30529 1726882701.79447: done dumping result, returning 30529 1726882701.79456: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-0000000024b4] 30529 1726882701.79459: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b4 30529 1726882701.79552: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b4 30529 1726882701.79555: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30529 1726882701.79666: no more pending results, returning what we have 30529 1726882701.79669: results queue empty 30529 1726882701.79670: checking for any_errors_fatal 30529 1726882701.79676: done checking for any_errors_fatal 30529 1726882701.79677: checking for max_fail_percentage 30529 1726882701.79679: done checking for max_fail_percentage 30529 1726882701.79680: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.79680: done checking to see if all hosts have failed 30529 1726882701.79681: getting the remaining hosts for this loop 30529 1726882701.79683: done getting the remaining hosts for this loop 30529 1726882701.79686: getting the next task for host managed_node1 30529 
1726882701.79695: done getting next task for host managed_node1 30529 1726882701.79698: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882701.79702: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.79715: getting variables 30529 1726882701.79716: in VariableManager get_vars() 30529 1726882701.79757: Calling all_inventory to load vars for managed_node1 30529 1726882701.79759: Calling groups_inventory to load vars for managed_node1 30529 1726882701.79762: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.79771: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.79773: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.79776: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.80730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.81578: done with get_vars() 30529 1726882701.81596: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:21 -0400 (0:00:00.572) 0:01:55.842 ****** 30529 1726882701.81659: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882701.81905: worker is 1 (out of 1 available) 30529 1726882701.81921: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882701.81934: done queuing things up, now waiting for results queue to drain 30529 1726882701.81936: waiting for pending results... 
30529 1726882701.82131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882701.82215: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b5 30529 1726882701.82227: variable 'ansible_search_path' from source: unknown 30529 1726882701.82231: variable 'ansible_search_path' from source: unknown 30529 1726882701.82259: calling self._execute() 30529 1726882701.82343: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.82347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.82356: variable 'omit' from source: magic vars 30529 1726882701.82648: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.82657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.82747: variable 'network_state' from source: role '' defaults 30529 1726882701.82756: Evaluated conditional (network_state != {}): False 30529 1726882701.82759: when evaluation is False, skipping this task 30529 1726882701.82761: _execute() done 30529 1726882701.82764: dumping result to json 30529 1726882701.82766: done dumping result, returning 30529 1726882701.82773: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-0000000024b5] 30529 1726882701.82778: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b5 30529 1726882701.82867: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b5 30529 1726882701.82870: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882701.82923: no more pending results, returning what we have 30529 1726882701.82926: results queue empty 30529 1726882701.82927: checking for any_errors_fatal 30529 1726882701.82940: done checking for any_errors_fatal 
30529 1726882701.82940: checking for max_fail_percentage 30529 1726882701.82942: done checking for max_fail_percentage 30529 1726882701.82943: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.82944: done checking to see if all hosts have failed 30529 1726882701.82944: getting the remaining hosts for this loop 30529 1726882701.82946: done getting the remaining hosts for this loop 30529 1726882701.82950: getting the next task for host managed_node1 30529 1726882701.82957: done getting next task for host managed_node1 30529 1726882701.82961: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882701.82966: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.82992: getting variables 30529 1726882701.82996: in VariableManager get_vars() 30529 1726882701.83036: Calling all_inventory to load vars for managed_node1 30529 1726882701.83038: Calling groups_inventory to load vars for managed_node1 30529 1726882701.83040: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.83050: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.83053: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.83055: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.83848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.85422: done with get_vars() 30529 1726882701.85443: done getting variables 30529 1726882701.85504: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:21 -0400 (0:00:00.038) 0:01:55.881 ****** 30529 1726882701.85540: entering _queue_task() for managed_node1/debug 30529 1726882701.85801: worker is 1 (out of 1 available) 30529 1726882701.85814: exiting _queue_task() for managed_node1/debug 30529 1726882701.85827: done queuing things up, now waiting for results queue to drain 30529 1726882701.85829: waiting for pending results... 
30529 1726882701.86023: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882701.86112: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b6 30529 1726882701.86123: variable 'ansible_search_path' from source: unknown 30529 1726882701.86126: variable 'ansible_search_path' from source: unknown 30529 1726882701.86154: calling self._execute() 30529 1726882701.86234: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.86237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.86246: variable 'omit' from source: magic vars 30529 1726882701.86529: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.86539: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.86545: variable 'omit' from source: magic vars 30529 1726882701.86587: variable 'omit' from source: magic vars 30529 1726882701.86616: variable 'omit' from source: magic vars 30529 1726882701.86647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882701.86674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882701.86696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882701.86711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.86723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.86745: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882701.86748: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.86751: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882701.86826: Set connection var ansible_shell_executable to /bin/sh 30529 1726882701.86829: Set connection var ansible_pipelining to False 30529 1726882701.86832: Set connection var ansible_shell_type to sh 30529 1726882701.86839: Set connection var ansible_timeout to 10 30529 1726882701.86842: Set connection var ansible_connection to ssh 30529 1726882701.86846: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882701.86863: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.86866: variable 'ansible_connection' from source: unknown 30529 1726882701.86869: variable 'ansible_module_compression' from source: unknown 30529 1726882701.86871: variable 'ansible_shell_type' from source: unknown 30529 1726882701.86873: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.86875: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.86877: variable 'ansible_pipelining' from source: unknown 30529 1726882701.86881: variable 'ansible_timeout' from source: unknown 30529 1726882701.86885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.86985: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882701.87000: variable 'omit' from source: magic vars 30529 1726882701.87004: starting attempt loop 30529 1726882701.87007: running the handler 30529 1726882701.87100: variable '__network_connections_result' from source: set_fact 30529 1726882701.87139: handler run complete 30529 1726882701.87152: attempt loop complete, returning result 30529 1726882701.87155: _execute() done 30529 1726882701.87158: dumping result to json 30529 1726882701.87160: 
done dumping result, returning 30529 1726882701.87169: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000024b6] 30529 1726882701.87172: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b6 30529 1726882701.87256: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b6 30529 1726882701.87258: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 30529 1726882701.87331: no more pending results, returning what we have 30529 1726882701.87335: results queue empty 30529 1726882701.87336: checking for any_errors_fatal 30529 1726882701.87342: done checking for any_errors_fatal 30529 1726882701.87342: checking for max_fail_percentage 30529 1726882701.87344: done checking for max_fail_percentage 30529 1726882701.87345: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.87346: done checking to see if all hosts have failed 30529 1726882701.87346: getting the remaining hosts for this loop 30529 1726882701.87348: done getting the remaining hosts for this loop 30529 1726882701.87351: getting the next task for host managed_node1 30529 1726882701.87359: done getting next task for host managed_node1 30529 1726882701.87362: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882701.87366: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882701.87378: getting variables 30529 1726882701.87379: in VariableManager get_vars() 30529 1726882701.87418: Calling all_inventory to load vars for managed_node1 30529 1726882701.87421: Calling groups_inventory to load vars for managed_node1 30529 1726882701.87423: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.87432: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.87434: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.87437: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.88214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.89066: done with get_vars() 30529 1726882701.89081: done getting variables 30529 1726882701.89123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:21 -0400 (0:00:00.036) 0:01:55.917 ****** 30529 1726882701.89152: entering _queue_task() for managed_node1/debug 30529 1726882701.89364: worker is 1 (out of 1 available) 30529 1726882701.89378: exiting _queue_task() for managed_node1/debug 30529 1726882701.89390: done queuing things up, now waiting for results queue to drain 30529 1726882701.89392: waiting for pending results... 30529 1726882701.89568: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882701.89669: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b7 30529 1726882701.89682: variable 'ansible_search_path' from source: unknown 30529 1726882701.89685: variable 'ansible_search_path' from source: unknown 30529 1726882701.89717: calling self._execute() 30529 1726882701.89795: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.89799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.89805: variable 'omit' from source: magic vars 30529 1726882701.90075: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.90085: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.90092: variable 'omit' from source: magic vars 30529 1726882701.90135: variable 'omit' from source: magic vars 30529 1726882701.90159: variable 'omit' from source: magic vars 30529 1726882701.90195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882701.90221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882701.90238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882701.90251: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.90263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.90294: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882701.90298: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.90300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.90365: Set connection var ansible_shell_executable to /bin/sh 30529 1726882701.90368: Set connection var ansible_pipelining to False 30529 1726882701.90371: Set connection var ansible_shell_type to sh 30529 1726882701.90380: Set connection var ansible_timeout to 10 30529 1726882701.90383: Set connection var ansible_connection to ssh 30529 1726882701.90385: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882701.90408: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.90411: variable 'ansible_connection' from source: unknown 30529 1726882701.90415: variable 'ansible_module_compression' from source: unknown 30529 1726882701.90418: variable 'ansible_shell_type' from source: unknown 30529 1726882701.90420: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.90422: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.90425: variable 'ansible_pipelining' from source: unknown 30529 1726882701.90427: variable 'ansible_timeout' from source: unknown 30529 1726882701.90429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.90528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882701.90538: variable 'omit' from source: magic vars 30529 1726882701.90543: starting attempt loop 30529 1726882701.90546: running the handler 30529 1726882701.90585: variable '__network_connections_result' from source: set_fact 30529 1726882701.90644: variable '__network_connections_result' from source: set_fact 30529 1726882701.90720: handler run complete 30529 1726882701.90736: attempt loop complete, returning result 30529 1726882701.90739: _execute() done 30529 1726882701.90741: dumping result to json 30529 1726882701.90744: done dumping result, returning 30529 1726882701.90752: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000024b7] 30529 1726882701.90755: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b7 30529 1726882701.90844: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b7 30529 1726882701.90847: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30529 1726882701.90938: no more pending results, returning what we have 30529 1726882701.90942: results queue empty 30529 1726882701.90943: checking for any_errors_fatal 30529 1726882701.90947: done checking for any_errors_fatal 30529 1726882701.90948: checking for max_fail_percentage 30529 1726882701.90950: done checking for max_fail_percentage 30529 1726882701.90951: checking to see if 
all hosts have failed and the running result is not ok 30529 1726882701.90951: done checking to see if all hosts have failed 30529 1726882701.90952: getting the remaining hosts for this loop 30529 1726882701.90954: done getting the remaining hosts for this loop 30529 1726882701.90959: getting the next task for host managed_node1 30529 1726882701.90967: done getting next task for host managed_node1 30529 1726882701.90970: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882701.90974: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.90985: getting variables 30529 1726882701.90989: in VariableManager get_vars() 30529 1726882701.91026: Calling all_inventory to load vars for managed_node1 30529 1726882701.91028: Calling groups_inventory to load vars for managed_node1 30529 1726882701.91031: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.91038: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.91041: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.91049: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.91951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.92786: done with get_vars() 30529 1726882701.92806: done getting variables 30529 1726882701.92845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:21 -0400 (0:00:00.037) 0:01:55.954 ****** 30529 1726882701.92868: entering _queue_task() for managed_node1/debug 30529 1726882701.93083: worker is 1 (out of 1 available) 30529 1726882701.93102: exiting _queue_task() for managed_node1/debug 30529 1726882701.93114: done queuing things up, now waiting for results queue to drain 30529 1726882701.93116: waiting for pending results... 
30529 1726882701.93298: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882701.93387: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b8 30529 1726882701.93401: variable 'ansible_search_path' from source: unknown 30529 1726882701.93406: variable 'ansible_search_path' from source: unknown 30529 1726882701.93430: calling self._execute() 30529 1726882701.93505: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.93509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.93517: variable 'omit' from source: magic vars 30529 1726882701.93787: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.93798: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.93877: variable 'network_state' from source: role '' defaults 30529 1726882701.93891: Evaluated conditional (network_state != {}): False 30529 1726882701.93895: when evaluation is False, skipping this task 30529 1726882701.93898: _execute() done 30529 1726882701.93901: dumping result to json 30529 1726882701.93904: done dumping result, returning 30529 1726882701.93910: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-0000000024b8] 30529 1726882701.93913: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b8 30529 1726882701.93995: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b8 30529 1726882701.93998: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882701.94046: no more pending results, returning what we have 30529 1726882701.94049: results queue empty 30529 1726882701.94051: checking for any_errors_fatal 30529 1726882701.94058: done checking for any_errors_fatal 30529 1726882701.94059: checking for 
max_fail_percentage 30529 1726882701.94060: done checking for max_fail_percentage 30529 1726882701.94061: checking to see if all hosts have failed and the running result is not ok 30529 1726882701.94062: done checking to see if all hosts have failed 30529 1726882701.94063: getting the remaining hosts for this loop 30529 1726882701.94064: done getting the remaining hosts for this loop 30529 1726882701.94068: getting the next task for host managed_node1 30529 1726882701.94075: done getting next task for host managed_node1 30529 1726882701.94078: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882701.94083: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882701.94115: getting variables 30529 1726882701.94117: in VariableManager get_vars() 30529 1726882701.94151: Calling all_inventory to load vars for managed_node1 30529 1726882701.94154: Calling groups_inventory to load vars for managed_node1 30529 1726882701.94156: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882701.94163: Calling all_plugins_play to load vars for managed_node1 30529 1726882701.94166: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882701.94168: Calling groups_plugins_play to load vars for managed_node1 30529 1726882701.94909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882701.95768: done with get_vars() 30529 1726882701.95783: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:21 -0400 (0:00:00.029) 0:01:55.984 ****** 30529 1726882701.95850: entering _queue_task() for managed_node1/ping 30529 1726882701.96055: worker is 1 (out of 1 available) 30529 1726882701.96069: exiting _queue_task() for managed_node1/ping 30529 1726882701.96081: done queuing things up, now waiting for results queue to drain 30529 1726882701.96083: waiting for pending results... 
30529 1726882701.96253: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882701.96348: in run() - task 12673a56-9f93-b0f1-edc0-0000000024b9 30529 1726882701.96360: variable 'ansible_search_path' from source: unknown 30529 1726882701.96364: variable 'ansible_search_path' from source: unknown 30529 1726882701.96392: calling self._execute() 30529 1726882701.96459: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.96463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.96471: variable 'omit' from source: magic vars 30529 1726882701.96729: variable 'ansible_distribution_major_version' from source: facts 30529 1726882701.96740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882701.96743: variable 'omit' from source: magic vars 30529 1726882701.96786: variable 'omit' from source: magic vars 30529 1726882701.96808: variable 'omit' from source: magic vars 30529 1726882701.96838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882701.96865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882701.96881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882701.96898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.96908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882701.96931: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882701.96934: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.96936: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882701.97009: Set connection var ansible_shell_executable to /bin/sh 30529 1726882701.97012: Set connection var ansible_pipelining to False 30529 1726882701.97015: Set connection var ansible_shell_type to sh 30529 1726882701.97022: Set connection var ansible_timeout to 10 30529 1726882701.97024: Set connection var ansible_connection to ssh 30529 1726882701.97030: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882701.97045: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.97048: variable 'ansible_connection' from source: unknown 30529 1726882701.97051: variable 'ansible_module_compression' from source: unknown 30529 1726882701.97053: variable 'ansible_shell_type' from source: unknown 30529 1726882701.97055: variable 'ansible_shell_executable' from source: unknown 30529 1726882701.97057: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882701.97061: variable 'ansible_pipelining' from source: unknown 30529 1726882701.97064: variable 'ansible_timeout' from source: unknown 30529 1726882701.97068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882701.97208: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882701.97217: variable 'omit' from source: magic vars 30529 1726882701.97223: starting attempt loop 30529 1726882701.97225: running the handler 30529 1726882701.97236: _low_level_execute_command(): starting 30529 1726882701.97242: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882701.97754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882701.97758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882701.97761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882701.97764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882701.97819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882701.97821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882701.97824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882701.97874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882701.99463: stdout chunk (state=3): >>>/root <<< 30529 1726882701.99568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882701.99597: stderr chunk (state=3): >>><<< 30529 1726882701.99601: stdout chunk (state=3): >>><<< 30529 1726882701.99620: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
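The first `_low_level_execute_command()` above runs `/bin/sh -c 'echo ~ && sleep 0'` to discover the remote user's home directory (the `/root` in the stdout chunk). A minimal local sketch of that probe, assuming a POSIX `/bin/sh` that performs tilde expansion:

```python
import os
import subprocess

# Run the same probe the ssh connection plugin uses to resolve the remote
# home directory (executed locally here for illustration).
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
home = result.stdout.strip()
print(home)
```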
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882701.99631: _low_level_execute_command(): starting 30529 1726882701.99637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050 `" && echo ansible-tmp-1726882701.9962015-35729-223646054349050="` echo /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050 `" ) && sleep 0' 30529 1726882702.00042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.00045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.00055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration <<< 30529 1726882702.00058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.00107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.00114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.00153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.02006: stdout chunk (state=3): >>>ansible-tmp-1726882701.9962015-35729-223646054349050=/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050 <<< 30529 1726882702.02108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.02131: stderr chunk (state=3): >>><<< 30529 1726882702.02134: stdout chunk (state=3): >>><<< 30529 1726882702.02146: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882701.9962015-35729-223646054349050=/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
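The tmpdir command above (`umask 77 && mkdir -p ... && mkdir ... && echo name="path"`) creates a mode-0700 working directory and echoes back a `name=path` pair the controller parses. A sketch of the same shell pattern run under a throwaway local root instead of `/root/.ansible/tmp` (the root and directory name below are illustrative):

```python
import os
import stat
import subprocess
import tempfile
import time

# Recreate the remote tmpdir creation pattern locally.
root = tempfile.mkdtemp()
name = "ansible-tmp-%s-sketch" % time.time()
cmd = (
    '( umask 77 && mkdir -p "` echo %s `" && mkdir "` echo %s/%s `" '
    '&& echo %s="` echo %s/%s `" ) && sleep 0'
) % (root, root, name, name, root, name)
out = subprocess.run(["/bin/sh", "-c", cmd],
                     capture_output=True, text=True, check=True)

# The controller splits the echoed name=path line to learn the tmpdir.
tmpdir = out.stdout.strip().split("=", 1)[1]
mode = stat.S_IMODE(os.stat(tmpdir).st_mode)
print(tmpdir, oct(mode))
```

The `umask 77` is what makes the directory private: `0777 & ~077` leaves `0700`, so only the remote user can read the staged module payload.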
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882702.02179: variable 'ansible_module_compression' from source: unknown 30529 1726882702.02212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882702.02244: variable 'ansible_facts' from source: unknown 30529 1726882702.02297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py 30529 1726882702.02386: Sending initial data 30529 1726882702.02392: Sent initial data (153 bytes) 30529 1726882702.02819: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.02822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882702.02824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.02826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.02828: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882702.02830: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.02875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.02879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.02923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.04422: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882702.04428: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882702.04459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882702.04505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwjx91fsq /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py <<< 30529 1726882702.04514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py" <<< 30529 1726882702.04547: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpwjx91fsq" to remote "/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py" <<< 30529 1726882702.05034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.05068: stderr chunk (state=3): >>><<< 30529 1726882702.05071: stdout chunk (state=3): >>><<< 30529 1726882702.05103: done transferring module to remote 30529 1726882702.05111: _low_level_execute_command(): starting 30529 1726882702.05114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/ /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py && sleep 0' 30529 1726882702.05525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.05528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882702.05530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
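The `ANSIBALLZ: using cached module ... ping-ZIP_DEFLATED` line refers to Ansible bundling the module source and its imports into a zip payload (compressed with `ZIP_DEFLATED`, per `ansible_module_compression`) before sftp-ing it across as `AnsiballZ_ping.py`. A simplified round-trip sketch of that packaging idea, not Ansible's actual wrapper; the archive member name and stand-in source are invented for illustration:

```python
import io
import zipfile

# Pack a tiny stand-in "module" into an in-memory ZIP_DEFLATED archive,
# roughly how AnsiballZ bundles ansible.modules.ping and its imports.
module_source = 'print({"ping": "pong"})\n'
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ansible/modules/ping.py", module_source)

# Reading it back shows the payload round-trips intact, which is what
# lets the remote wrapper script unpack and execute it.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    restored = zf.read("ansible/modules/ping.py").decode()
print(restored == module_source)
```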
1726882702.05532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.05538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882702.05540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.05583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.05587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.05632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.07333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.07354: stderr chunk (state=3): >>><<< 30529 1726882702.07357: stdout chunk (state=3): >>><<< 30529 1726882702.07369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882702.07372: _low_level_execute_command(): starting 30529 1726882702.07376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/AnsiballZ_ping.py && sleep 0' 30529 1726882702.07775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.07778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882702.07780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.07782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.07784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.07832: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.07835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.07883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.22560: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882702.23743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882702.23772: stderr chunk (state=3): >>><<< 30529 1726882702.23775: stdout chunk (state=3): >>><<< 30529 1726882702.23792: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
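The module's entire stdout is the single JSON object `{"ping": "pong", "invocation": {...}}`, which the controller parses into the task result. A sketch of the documented ping contract (echo back `data`, defaulting to `"pong"`) and of parsing its output; this is not the real `ansible.modules.ping` source:

```python
import json

def ping_module(module_args):
    # Documented ping behavior: return the "data" argument back,
    # defaulting to "pong"; data == "crash" makes the module raise.
    data = module_args.get("data", "pong")
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": module_args}}

# The remote side emits JSON on stdout; the controller parses it.
stdout = json.dumps(ping_module({"data": "pong"}))
result = json.loads(stdout)
print(result["ping"])
```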
status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882702.23816: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882702.23826: _low_level_execute_command(): starting 30529 1726882702.23831: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882701.9962015-35729-223646054349050/ > /dev/null 2>&1 && sleep 0' 30529 1726882702.24275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.24279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.24294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.24352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.24355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882702.24364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.24408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.26222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.26250: stderr chunk (state=3): >>><<< 30529 1726882702.26253: stdout chunk (state=3): >>><<< 30529 1726882702.26267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
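The final low-level command above removes the staging directory with `rm -f -r <tmpdir>/ > /dev/null 2>&1 && sleep 0`, discarding any output. A local sketch of that cleanup step, using a scratch directory in place of the remote `/root/.ansible/tmp/...` path:

```python
import os
import subprocess
import tempfile

# Create a scratch dir with a staged file in it, then remove it the way
# the connection plugin does (rm -f -r, output discarded).
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-sketch-")
open(os.path.join(tmpdir, "AnsiballZ_ping.py"), "w").close()
subprocess.run(
    ["/bin/sh", "-c", "rm -f -r %s/ > /dev/null 2>&1 && sleep 0" % tmpdir],
    check=True,
)
print(os.path.exists(tmpdir))
```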
master 0 30529 1726882702.26272: handler run complete 30529 1726882702.26286: attempt loop complete, returning result 30529 1726882702.26291: _execute() done 30529 1726882702.26296: dumping result to json 30529 1726882702.26298: done dumping result, returning 30529 1726882702.26306: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-0000000024b9] 30529 1726882702.26310: sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b9 30529 1726882702.26400: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000024b9 30529 1726882702.26402: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882702.26473: no more pending results, returning what we have 30529 1726882702.26476: results queue empty 30529 1726882702.26477: checking for any_errors_fatal 30529 1726882702.26484: done checking for any_errors_fatal 30529 1726882702.26485: checking for max_fail_percentage 30529 1726882702.26487: done checking for max_fail_percentage 30529 1726882702.26490: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.26491: done checking to see if all hosts have failed 30529 1726882702.26492: getting the remaining hosts for this loop 30529 1726882702.26495: done getting the remaining hosts for this loop 30529 1726882702.26499: getting the next task for host managed_node1 30529 1726882702.26512: done getting next task for host managed_node1 30529 1726882702.26514: ^ task is: TASK: meta (role_complete) 30529 1726882702.26518: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882702.26532: getting variables 30529 1726882702.26534: in VariableManager get_vars() 30529 1726882702.26579: Calling all_inventory to load vars for managed_node1 30529 1726882702.26581: Calling groups_inventory to load vars for managed_node1 30529 1726882702.26583: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.26600: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.26603: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.26606: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.27542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.28395: done with get_vars() 30529 1726882702.28412: done getting variables 30529 1726882702.28470: done queuing things up, now waiting for results queue to drain 30529 1726882702.28471: results queue empty 30529 1726882702.28472: checking for any_errors_fatal 30529 1726882702.28473: done checking for any_errors_fatal 30529 1726882702.28474: checking for max_fail_percentage 30529 1726882702.28474: done checking for max_fail_percentage 30529 1726882702.28475: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882702.28475: done checking to see if all hosts have failed 30529 1726882702.28476: getting the remaining hosts for this loop 30529 1726882702.28476: done getting the remaining hosts for this loop 30529 1726882702.28478: getting the next task for host managed_node1 30529 1726882702.28481: done getting next task for host managed_node1 30529 1726882702.28483: ^ task is: TASK: Test 30529 1726882702.28484: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882702.28486: getting variables 30529 1726882702.28486: in VariableManager get_vars() 30529 1726882702.28498: Calling all_inventory to load vars for managed_node1 30529 1726882702.28500: Calling groups_inventory to load vars for managed_node1 30529 1726882702.28501: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.28505: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.28506: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.28508: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.29127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.30046: done with get_vars() 30529 1726882702.30061: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:38:22 -0400 (0:00:00.342) 0:01:56.327 ****** 30529 1726882702.30113: entering _queue_task() for managed_node1/include_tasks 30529 1726882702.30352: worker is 1 (out of 1 available) 30529 1726882702.30366: exiting _queue_task() for managed_node1/include_tasks 30529 1726882702.30379: done queuing things up, now waiting for results queue to drain 30529 1726882702.30381: waiting for pending results... 
30529 1726882702.30562: running TaskExecutor() for managed_node1/TASK: Test 30529 1726882702.30638: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b1 30529 1726882702.30650: variable 'ansible_search_path' from source: unknown 30529 1726882702.30654: variable 'ansible_search_path' from source: unknown 30529 1726882702.30691: variable 'lsr_test' from source: include params 30529 1726882702.30851: variable 'lsr_test' from source: include params 30529 1726882702.30903: variable 'omit' from source: magic vars 30529 1726882702.31002: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.31010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.31019: variable 'omit' from source: magic vars 30529 1726882702.31176: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.31183: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.31192: variable 'item' from source: unknown 30529 1726882702.31238: variable 'item' from source: unknown 30529 1726882702.31259: variable 'item' from source: unknown 30529 1726882702.31309: variable 'item' from source: unknown 30529 1726882702.31437: dumping result to json 30529 1726882702.31440: done dumping result, returning 30529 1726882702.31442: done running TaskExecutor() for managed_node1/TASK: Test [12673a56-9f93-b0f1-edc0-0000000020b1] 30529 1726882702.31444: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b1 30529 1726882702.31477: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b1 30529 1726882702.31481: WORKER PROCESS EXITING 30529 1726882702.31559: no more pending results, returning what we have 30529 1726882702.31564: in VariableManager get_vars() 30529 1726882702.31600: Calling all_inventory to load vars for managed_node1 30529 1726882702.31602: Calling groups_inventory to load vars for managed_node1 30529 1726882702.31605: Calling all_plugins_inventory to load 
vars for managed_node1 30529 1726882702.31615: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.31618: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.31620: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.32325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.33155: done with get_vars() 30529 1726882702.33168: variable 'ansible_search_path' from source: unknown 30529 1726882702.33169: variable 'ansible_search_path' from source: unknown 30529 1726882702.33195: we have included files to process 30529 1726882702.33196: generating all_blocks data 30529 1726882702.33198: done generating all_blocks data 30529 1726882702.33202: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882702.33202: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882702.33204: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30529 1726882702.33279: done processing included file 30529 1726882702.33281: iterating over new_blocks loaded from include file 30529 1726882702.33282: in VariableManager get_vars() 30529 1726882702.33294: done with get_vars() 30529 1726882702.33295: filtering new block on tags 30529 1726882702.33312: done filtering new block on tags 30529 1726882702.33314: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node1 => (item=tasks/remove+down_profile.yml) 30529 1726882702.33317: extending task lists for all hosts with included blocks 30529 1726882702.33820: done extending task 
lists 30529 1726882702.33822: done processing included files 30529 1726882702.33822: results queue empty 30529 1726882702.33822: checking for any_errors_fatal 30529 1726882702.33823: done checking for any_errors_fatal 30529 1726882702.33824: checking for max_fail_percentage 30529 1726882702.33825: done checking for max_fail_percentage 30529 1726882702.33825: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.33826: done checking to see if all hosts have failed 30529 1726882702.33826: getting the remaining hosts for this loop 30529 1726882702.33827: done getting the remaining hosts for this loop 30529 1726882702.33828: getting the next task for host managed_node1 30529 1726882702.33831: done getting next task for host managed_node1 30529 1726882702.33833: ^ task is: TASK: Include network role 30529 1726882702.33835: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882702.33837: getting variables 30529 1726882702.33837: in VariableManager get_vars() 30529 1726882702.33844: Calling all_inventory to load vars for managed_node1 30529 1726882702.33846: Calling groups_inventory to load vars for managed_node1 30529 1726882702.33847: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.33851: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.33852: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.33854: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.38482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.39308: done with get_vars() 30529 1726882702.39326: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:38:22 -0400 (0:00:00.092) 0:01:56.419 ****** 30529 1726882702.39379: entering _queue_task() for managed_node1/include_role 30529 1726882702.39652: worker is 1 (out of 1 available) 30529 1726882702.39667: exiting _queue_task() for managed_node1/include_role 30529 1726882702.39681: done queuing things up, now waiting for results queue to drain 30529 1726882702.39683: waiting for pending results... 
30529 1726882702.39870: running TaskExecutor() for managed_node1/TASK: Include network role 30529 1726882702.39968: in run() - task 12673a56-9f93-b0f1-edc0-000000002612 30529 1726882702.39981: variable 'ansible_search_path' from source: unknown 30529 1726882702.39985: variable 'ansible_search_path' from source: unknown 30529 1726882702.40022: calling self._execute() 30529 1726882702.40095: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.40101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.40110: variable 'omit' from source: magic vars 30529 1726882702.40397: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.40408: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.40413: _execute() done 30529 1726882702.40417: dumping result to json 30529 1726882702.40420: done dumping result, returning 30529 1726882702.40427: done running TaskExecutor() for managed_node1/TASK: Include network role [12673a56-9f93-b0f1-edc0-000000002612] 30529 1726882702.40433: sending task result for task 12673a56-9f93-b0f1-edc0-000000002612 30529 1726882702.40530: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002612 30529 1726882702.40532: WORKER PROCESS EXITING 30529 1726882702.40582: no more pending results, returning what we have 30529 1726882702.40587: in VariableManager get_vars() 30529 1726882702.40635: Calling all_inventory to load vars for managed_node1 30529 1726882702.40637: Calling groups_inventory to load vars for managed_node1 30529 1726882702.40641: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.40652: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.40655: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.40657: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.41413: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.42260: done with get_vars() 30529 1726882702.42272: variable 'ansible_search_path' from source: unknown 30529 1726882702.42273: variable 'ansible_search_path' from source: unknown 30529 1726882702.42361: variable 'omit' from source: magic vars 30529 1726882702.42391: variable 'omit' from source: magic vars 30529 1726882702.42402: variable 'omit' from source: magic vars 30529 1726882702.42405: we have included files to process 30529 1726882702.42406: generating all_blocks data 30529 1726882702.42407: done generating all_blocks data 30529 1726882702.42408: processing included file: fedora.linux_system_roles.network 30529 1726882702.42422: in VariableManager get_vars() 30529 1726882702.42432: done with get_vars() 30529 1726882702.42449: in VariableManager get_vars() 30529 1726882702.42460: done with get_vars() 30529 1726882702.42485: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30529 1726882702.42559: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30529 1726882702.42605: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30529 1726882702.42864: in VariableManager get_vars() 30529 1726882702.42877: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882702.44067: iterating over new_blocks loaded from include file 30529 1726882702.44069: in VariableManager get_vars() 30529 1726882702.44080: done with get_vars() 30529 1726882702.44082: filtering new block on tags 30529 1726882702.44275: done filtering new block on tags 30529 1726882702.44278: in VariableManager get_vars() 30529 1726882702.44290: done with get_vars() 30529 1726882702.44291: filtering new block on tags 30529 1726882702.44303: done 
filtering new block on tags 30529 1726882702.44304: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 30529 1726882702.44308: extending task lists for all hosts with included blocks 30529 1726882702.44368: done extending task lists 30529 1726882702.44369: done processing included files 30529 1726882702.44369: results queue empty 30529 1726882702.44370: checking for any_errors_fatal 30529 1726882702.44372: done checking for any_errors_fatal 30529 1726882702.44373: checking for max_fail_percentage 30529 1726882702.44374: done checking for max_fail_percentage 30529 1726882702.44374: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.44375: done checking to see if all hosts have failed 30529 1726882702.44375: getting the remaining hosts for this loop 30529 1726882702.44376: done getting the remaining hosts for this loop 30529 1726882702.44378: getting the next task for host managed_node1 30529 1726882702.44381: done getting next task for host managed_node1 30529 1726882702.44382: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882702.44384: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882702.44392: getting variables 30529 1726882702.44394: in VariableManager get_vars() 30529 1726882702.44403: Calling all_inventory to load vars for managed_node1 30529 1726882702.44405: Calling groups_inventory to load vars for managed_node1 30529 1726882702.44406: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.44409: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.44411: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.44412: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.45026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.45854: done with get_vars() 30529 1726882702.45868: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:22 -0400 (0:00:00.065) 0:01:56.485 ****** 30529 1726882702.45918: entering _queue_task() for managed_node1/include_tasks 30529 1726882702.46164: worker is 1 (out of 1 available) 30529 1726882702.46177: exiting _queue_task() for managed_node1/include_tasks 30529 1726882702.46191: done queuing things up, now waiting for results queue to drain 30529 1726882702.46194: waiting for pending results... 
30529 1726882702.46375: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30529 1726882702.46467: in run() - task 12673a56-9f93-b0f1-edc0-000000002694 30529 1726882702.46479: variable 'ansible_search_path' from source: unknown 30529 1726882702.46482: variable 'ansible_search_path' from source: unknown 30529 1726882702.46514: calling self._execute() 30529 1726882702.46586: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.46590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.46602: variable 'omit' from source: magic vars 30529 1726882702.46877: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.46886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.46896: _execute() done 30529 1726882702.46900: dumping result to json 30529 1726882702.46903: done dumping result, returning 30529 1726882702.46909: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-b0f1-edc0-000000002694] 30529 1726882702.46914: sending task result for task 12673a56-9f93-b0f1-edc0-000000002694 30529 1726882702.47000: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002694 30529 1726882702.47004: WORKER PROCESS EXITING 30529 1726882702.47053: no more pending results, returning what we have 30529 1726882702.47058: in VariableManager get_vars() 30529 1726882702.47107: Calling all_inventory to load vars for managed_node1 30529 1726882702.47110: Calling groups_inventory to load vars for managed_node1 30529 1726882702.47113: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.47124: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.47127: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.47129: Calling 
groups_plugins_play to load vars for managed_node1 30529 1726882702.47959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.48815: done with get_vars() 30529 1726882702.48829: variable 'ansible_search_path' from source: unknown 30529 1726882702.48830: variable 'ansible_search_path' from source: unknown 30529 1726882702.48854: we have included files to process 30529 1726882702.48855: generating all_blocks data 30529 1726882702.48856: done generating all_blocks data 30529 1726882702.48858: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882702.48859: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882702.48860: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30529 1726882702.49227: done processing included file 30529 1726882702.49229: iterating over new_blocks loaded from include file 30529 1726882702.49229: in VariableManager get_vars() 30529 1726882702.49245: done with get_vars() 30529 1726882702.49246: filtering new block on tags 30529 1726882702.49266: done filtering new block on tags 30529 1726882702.49268: in VariableManager get_vars() 30529 1726882702.49282: done with get_vars() 30529 1726882702.49283: filtering new block on tags 30529 1726882702.49311: done filtering new block on tags 30529 1726882702.49312: in VariableManager get_vars() 30529 1726882702.49327: done with get_vars() 30529 1726882702.49328: filtering new block on tags 30529 1726882702.49350: done filtering new block on tags 30529 1726882702.49351: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 30529 1726882702.49355: extending task lists for 
all hosts with included blocks 30529 1726882702.50278: done extending task lists 30529 1726882702.50279: done processing included files 30529 1726882702.50281: results queue empty 30529 1726882702.50282: checking for any_errors_fatal 30529 1726882702.50284: done checking for any_errors_fatal 30529 1726882702.50285: checking for max_fail_percentage 30529 1726882702.50285: done checking for max_fail_percentage 30529 1726882702.50286: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.50287: done checking to see if all hosts have failed 30529 1726882702.50287: getting the remaining hosts for this loop 30529 1726882702.50288: done getting the remaining hosts for this loop 30529 1726882702.50290: getting the next task for host managed_node1 30529 1726882702.50295: done getting next task for host managed_node1 30529 1726882702.50297: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882702.50300: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882702.50307: getting variables 30529 1726882702.50308: in VariableManager get_vars() 30529 1726882702.50317: Calling all_inventory to load vars for managed_node1 30529 1726882702.50319: Calling groups_inventory to load vars for managed_node1 30529 1726882702.50320: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.50323: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.50325: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.50327: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.50930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.51757: done with get_vars() 30529 1726882702.51771: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:22 -0400 (0:00:00.059) 0:01:56.544 ****** 30529 1726882702.51824: entering _queue_task() for managed_node1/setup 30529 1726882702.52081: worker is 1 (out of 1 available) 30529 1726882702.52096: exiting _queue_task() for managed_node1/setup 30529 1726882702.52110: done queuing things up, now waiting for results queue to drain 30529 1726882702.52112: waiting for pending results... 
30529 1726882702.52296: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30529 1726882702.52394: in run() - task 12673a56-9f93-b0f1-edc0-0000000026eb 30529 1726882702.52410: variable 'ansible_search_path' from source: unknown 30529 1726882702.52415: variable 'ansible_search_path' from source: unknown 30529 1726882702.52443: calling self._execute() 30529 1726882702.52517: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.52521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.52528: variable 'omit' from source: magic vars 30529 1726882702.52807: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.52818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.52961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882702.54528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882702.54568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882702.54601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882702.54630: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882702.54647: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882702.54706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882702.54727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882702.54748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882702.54774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882702.54785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882702.54826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882702.54847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882702.54861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882702.54885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882702.54901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882702.55007: variable '__network_required_facts' from source: role 
'' defaults 30529 1726882702.55015: variable 'ansible_facts' from source: unknown 30529 1726882702.55428: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30529 1726882702.55432: when evaluation is False, skipping this task 30529 1726882702.55435: _execute() done 30529 1726882702.55437: dumping result to json 30529 1726882702.55440: done dumping result, returning 30529 1726882702.55446: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-b0f1-edc0-0000000026eb] 30529 1726882702.55450: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026eb 30529 1726882702.55533: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026eb 30529 1726882702.55536: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882702.55575: no more pending results, returning what we have 30529 1726882702.55578: results queue empty 30529 1726882702.55579: checking for any_errors_fatal 30529 1726882702.55581: done checking for any_errors_fatal 30529 1726882702.55581: checking for max_fail_percentage 30529 1726882702.55583: done checking for max_fail_percentage 30529 1726882702.55584: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.55585: done checking to see if all hosts have failed 30529 1726882702.55585: getting the remaining hosts for this loop 30529 1726882702.55587: done getting the remaining hosts for this loop 30529 1726882702.55590: getting the next task for host managed_node1 30529 1726882702.55604: done getting next task for host managed_node1 30529 1726882702.55607: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882702.55613: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882702.55642: getting variables 30529 1726882702.55643: in VariableManager get_vars() 30529 1726882702.55686: Calling all_inventory to load vars for managed_node1 30529 1726882702.55688: Calling groups_inventory to load vars for managed_node1 30529 1726882702.55691: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.55705: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.55708: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.55716: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.56557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.57435: done with get_vars() 30529 1726882702.57452: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:22 -0400 (0:00:00.056) 0:01:56.601 ****** 30529 1726882702.57519: entering _queue_task() for managed_node1/stat 30529 1726882702.57741: worker is 1 (out of 1 available) 30529 1726882702.57754: exiting _queue_task() for managed_node1/stat 30529 1726882702.57766: done queuing things up, now waiting for results queue to drain 30529 1726882702.57768: waiting for pending results... 
30529 1726882702.57950: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 30529 1726882702.58036: in run() - task 12673a56-9f93-b0f1-edc0-0000000026ed 30529 1726882702.58049: variable 'ansible_search_path' from source: unknown 30529 1726882702.58053: variable 'ansible_search_path' from source: unknown 30529 1726882702.58078: calling self._execute() 30529 1726882702.58158: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.58162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.58172: variable 'omit' from source: magic vars 30529 1726882702.58443: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.58452: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.58567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882702.58755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882702.58786: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882702.58815: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882702.58841: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882702.58906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882702.58923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882702.58941: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882702.58957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882702.59030: variable '__network_is_ostree' from source: set_fact 30529 1726882702.59034: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882702.59037: when evaluation is False, skipping this task 30529 1726882702.59040: _execute() done 30529 1726882702.59043: dumping result to json 30529 1726882702.59046: done dumping result, returning 30529 1726882702.59053: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-b0f1-edc0-0000000026ed] 30529 1726882702.59058: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026ed 30529 1726882702.59139: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026ed 30529 1726882702.59142: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882702.59230: no more pending results, returning what we have 30529 1726882702.59233: results queue empty 30529 1726882702.59234: checking for any_errors_fatal 30529 1726882702.59239: done checking for any_errors_fatal 30529 1726882702.59239: checking for max_fail_percentage 30529 1726882702.59241: done checking for max_fail_percentage 30529 1726882702.59242: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.59242: done checking to see if all hosts have failed 30529 1726882702.59243: getting the remaining hosts for this loop 30529 1726882702.59245: done getting the remaining hosts for this loop 30529 
1726882702.59248: getting the next task for host managed_node1 30529 1726882702.59255: done getting next task for host managed_node1 30529 1726882702.59258: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882702.59263: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882702.59285: getting variables 30529 1726882702.59286: in VariableManager get_vars() 30529 1726882702.59324: Calling all_inventory to load vars for managed_node1 30529 1726882702.59327: Calling groups_inventory to load vars for managed_node1 30529 1726882702.59329: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.59337: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.59339: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.59342: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.60079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.61052: done with get_vars() 30529 1726882702.61066: done getting variables 30529 1726882702.61109: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:22 -0400 (0:00:00.036) 0:01:56.637 ****** 30529 1726882702.61137: entering _queue_task() for managed_node1/set_fact 30529 1726882702.61363: worker is 1 (out of 1 available) 30529 1726882702.61376: exiting _queue_task() for managed_node1/set_fact 30529 1726882702.61392: done queuing things up, now waiting for results queue to drain 30529 1726882702.61395: waiting for pending results... 
30529 1726882702.61562: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30529 1726882702.61658: in run() - task 12673a56-9f93-b0f1-edc0-0000000026ee 30529 1726882702.61670: variable 'ansible_search_path' from source: unknown 30529 1726882702.61674: variable 'ansible_search_path' from source: unknown 30529 1726882702.61703: calling self._execute() 30529 1726882702.61776: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.61780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.61791: variable 'omit' from source: magic vars 30529 1726882702.62061: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.62071: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.62185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882702.62373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882702.62412: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882702.62439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882702.62465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882702.62529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882702.62547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882702.62565: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882702.62582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882702.62653: variable '__network_is_ostree' from source: set_fact 30529 1726882702.62659: Evaluated conditional (not __network_is_ostree is defined): False 30529 1726882702.62662: when evaluation is False, skipping this task 30529 1726882702.62664: _execute() done 30529 1726882702.62666: dumping result to json 30529 1726882702.62671: done dumping result, returning 30529 1726882702.62680: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-b0f1-edc0-0000000026ee] 30529 1726882702.62682: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026ee 30529 1726882702.62764: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026ee 30529 1726882702.62767: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30529 1726882702.62854: no more pending results, returning what we have 30529 1726882702.62857: results queue empty 30529 1726882702.62858: checking for any_errors_fatal 30529 1726882702.62864: done checking for any_errors_fatal 30529 1726882702.62865: checking for max_fail_percentage 30529 1726882702.62866: done checking for max_fail_percentage 30529 1726882702.62867: checking to see if all hosts have failed and the running result is not ok 30529 1726882702.62868: done checking to see if all hosts have failed 30529 1726882702.62868: getting the remaining hosts for this loop 30529 1726882702.62870: done getting the remaining hosts for this loop 
30529 1726882702.62873: getting the next task for host managed_node1 30529 1726882702.62885: done getting next task for host managed_node1 30529 1726882702.62891: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882702.62901: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882702.62925: getting variables 30529 1726882702.62926: in VariableManager get_vars() 30529 1726882702.62963: Calling all_inventory to load vars for managed_node1 30529 1726882702.62965: Calling groups_inventory to load vars for managed_node1 30529 1726882702.62967: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882702.62975: Calling all_plugins_play to load vars for managed_node1 30529 1726882702.62978: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882702.62980: Calling groups_plugins_play to load vars for managed_node1 30529 1726882702.63753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882702.64616: done with get_vars() 30529 1726882702.64634: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:22 -0400 (0:00:00.035) 0:01:56.673 ****** 30529 1726882702.64705: entering _queue_task() for managed_node1/service_facts 30529 1726882702.64952: worker is 1 (out of 1 available) 30529 1726882702.64964: exiting _queue_task() for managed_node1/service_facts 30529 1726882702.64977: done queuing things up, now waiting for results queue to drain 30529 1726882702.64979: waiting for pending results... 
30529 1726882702.65171: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 30529 1726882702.65274: in run() - task 12673a56-9f93-b0f1-edc0-0000000026f0 30529 1726882702.65290: variable 'ansible_search_path' from source: unknown 30529 1726882702.65295: variable 'ansible_search_path' from source: unknown 30529 1726882702.65325: calling self._execute() 30529 1726882702.65400: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.65405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.65413: variable 'omit' from source: magic vars 30529 1726882702.65701: variable 'ansible_distribution_major_version' from source: facts 30529 1726882702.65711: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882702.65717: variable 'omit' from source: magic vars 30529 1726882702.65768: variable 'omit' from source: magic vars 30529 1726882702.65791: variable 'omit' from source: magic vars 30529 1726882702.65825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882702.65852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882702.65870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882702.65884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882702.65897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882702.65921: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882702.65924: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.65927: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882702.65998: Set connection var ansible_shell_executable to /bin/sh 30529 1726882702.66003: Set connection var ansible_pipelining to False 30529 1726882702.66005: Set connection var ansible_shell_type to sh 30529 1726882702.66014: Set connection var ansible_timeout to 10 30529 1726882702.66017: Set connection var ansible_connection to ssh 30529 1726882702.66022: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882702.66038: variable 'ansible_shell_executable' from source: unknown 30529 1726882702.66041: variable 'ansible_connection' from source: unknown 30529 1726882702.66044: variable 'ansible_module_compression' from source: unknown 30529 1726882702.66046: variable 'ansible_shell_type' from source: unknown 30529 1726882702.66048: variable 'ansible_shell_executable' from source: unknown 30529 1726882702.66050: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882702.66055: variable 'ansible_pipelining' from source: unknown 30529 1726882702.66057: variable 'ansible_timeout' from source: unknown 30529 1726882702.66061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882702.66205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882702.66215: variable 'omit' from source: magic vars 30529 1726882702.66219: starting attempt loop 30529 1726882702.66221: running the handler 30529 1726882702.66235: _low_level_execute_command(): starting 30529 1726882702.66242: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882702.66744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882702.66747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.66751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.66808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.66811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.66872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.68538: stdout chunk (state=3): >>>/root <<< 30529 1726882702.68659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.68663: stdout chunk (state=3): >>><<< 30529 1726882702.68671: stderr chunk (state=3): >>><<< 30529 1726882702.68686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882702.68703: _low_level_execute_command(): starting 30529 1726882702.68706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188 `" && echo ansible-tmp-1726882702.686864-35743-34271045833188="` echo /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188 `" ) && sleep 0' 30529 1726882702.69132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.69135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882702.69137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.69140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.69149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.69196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.69203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.69243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.71099: stdout chunk (state=3): >>>ansible-tmp-1726882702.686864-35743-34271045833188=/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188 <<< 30529 1726882702.71203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.71229: stderr chunk (state=3): >>><<< 30529 1726882702.71232: stdout chunk (state=3): >>><<< 30529 1726882702.71247: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882702.686864-35743-34271045833188=/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882702.71282: variable 'ansible_module_compression' from source: unknown 30529 1726882702.71321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30529 1726882702.71351: variable 'ansible_facts' from source: unknown 30529 1726882702.71411: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py 30529 1726882702.71501: Sending initial data 30529 1726882702.71504: Sent initial data (160 bytes) 30529 1726882702.71933: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.71936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882702.71938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.71940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882702.71942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882702.71944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.71992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.72000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.72039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.73543: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882702.73547: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882702.73581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882702.73625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpft8ms51a /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py <<< 30529 1726882702.73627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py" <<< 30529 1726882702.73664: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpft8ms51a" to remote "/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py" <<< 30529 1726882702.74182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.74222: stderr chunk (state=3): >>><<< 30529 1726882702.74225: stdout chunk (state=3): >>><<< 30529 1726882702.74248: done transferring module to remote 30529 1726882702.74256: _low_level_execute_command(): starting 30529 1726882702.74260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/ /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py && sleep 0' 30529 1726882702.74672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.74675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.74677: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882702.74679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882702.74684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.74727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882702.74730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.74778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882702.76462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882702.76484: stderr chunk (state=3): >>><<< 30529 1726882702.76490: stdout chunk (state=3): >>><<< 30529 1726882702.76502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882702.76505: _low_level_execute_command(): starting 30529 1726882702.76509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/AnsiballZ_service_facts.py && sleep 0' 30529 1726882702.76905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.76908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.76912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882702.76914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882702.76916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882702.76962: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882702.76965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882702.77018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.27517: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30529 1726882704.27540: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30529 1726882704.27550: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 30529 1726882704.27576: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 30529 1726882704.27583: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": 
"systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30529 1726882704.29052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882704.29082: stderr chunk (state=3): >>><<< 30529 1726882704.29085: stdout chunk (state=3): >>><<< 30529 1726882704.29117: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": 
{"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": 
"systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882704.29854: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882704.29862: _low_level_execute_command(): starting 30529 1726882704.29866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882702.686864-35743-34271045833188/ > /dev/null 2>&1 && sleep 0' 30529 1726882704.30320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882704.30324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.30338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.30391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.30397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882704.30400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.30447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.32205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.32231: stderr chunk (state=3): >>><<< 30529 1726882704.32235: stdout chunk (state=3): >>><<< 30529 1726882704.32247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882704.32252: handler run complete 30529 1726882704.32368: variable 'ansible_facts' from source: unknown 30529 1726882704.32459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882704.32734: variable 'ansible_facts' from source: unknown 30529 1726882704.32818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882704.32931: attempt loop complete, returning result 30529 1726882704.32935: _execute() done 30529 1726882704.32938: dumping result to json 30529 1726882704.32972: done dumping result, returning 30529 1726882704.32980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-b0f1-edc0-0000000026f0] 30529 1726882704.32983: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026f0 30529 1726882704.33761: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026f0 30529 1726882704.33765: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882704.33826: no more pending results, returning what we have 30529 1726882704.33828: results queue empty 30529 1726882704.33829: checking for any_errors_fatal 30529 1726882704.33831: done checking for any_errors_fatal 30529 1726882704.33831: checking for max_fail_percentage 30529 1726882704.33832: done checking for max_fail_percentage 30529 1726882704.33833: checking to see if all hosts have failed and the running result is not ok 30529 1726882704.33833: done checking to see if all hosts have failed 30529 1726882704.33834: getting the remaining hosts for this loop 30529 1726882704.33835: done getting the remaining hosts for this loop 30529 
1726882704.33837: getting the next task for host managed_node1 30529 1726882704.33841: done getting next task for host managed_node1 30529 1726882704.33844: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882704.33849: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882704.33856: getting variables 30529 1726882704.33857: in VariableManager get_vars() 30529 1726882704.33880: Calling all_inventory to load vars for managed_node1 30529 1726882704.33882: Calling groups_inventory to load vars for managed_node1 30529 1726882704.33883: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882704.33891: Calling all_plugins_play to load vars for managed_node1 30529 1726882704.33895: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882704.33897: Calling groups_plugins_play to load vars for managed_node1 30529 1726882704.34591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882704.35455: done with get_vars() 30529 1726882704.35472: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:24 -0400 (0:00:01.708) 0:01:58.381 ****** 30529 1726882704.35546: entering _queue_task() for managed_node1/package_facts 30529 1726882704.35800: worker is 1 (out of 1 available) 30529 1726882704.35814: exiting _queue_task() for managed_node1/package_facts 30529 1726882704.35828: done queuing things up, now waiting for results queue to drain 30529 1726882704.35830: waiting for pending results... 
30529 1726882704.36013: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 30529 1726882704.36121: in run() - task 12673a56-9f93-b0f1-edc0-0000000026f1 30529 1726882704.36135: variable 'ansible_search_path' from source: unknown 30529 1726882704.36139: variable 'ansible_search_path' from source: unknown 30529 1726882704.36166: calling self._execute() 30529 1726882704.36248: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882704.36252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882704.36261: variable 'omit' from source: magic vars 30529 1726882704.36548: variable 'ansible_distribution_major_version' from source: facts 30529 1726882704.36559: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882704.36565: variable 'omit' from source: magic vars 30529 1726882704.36624: variable 'omit' from source: magic vars 30529 1726882704.36645: variable 'omit' from source: magic vars 30529 1726882704.36677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882704.36710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882704.36724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882704.36738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882704.36748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882704.36771: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882704.36774: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882704.36776: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 30529 1726882704.36851: Set connection var ansible_shell_executable to /bin/sh 30529 1726882704.36855: Set connection var ansible_pipelining to False 30529 1726882704.36857: Set connection var ansible_shell_type to sh 30529 1726882704.36865: Set connection var ansible_timeout to 10 30529 1726882704.36868: Set connection var ansible_connection to ssh 30529 1726882704.36872: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882704.36890: variable 'ansible_shell_executable' from source: unknown 30529 1726882704.36896: variable 'ansible_connection' from source: unknown 30529 1726882704.36899: variable 'ansible_module_compression' from source: unknown 30529 1726882704.36901: variable 'ansible_shell_type' from source: unknown 30529 1726882704.36903: variable 'ansible_shell_executable' from source: unknown 30529 1726882704.36906: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882704.36910: variable 'ansible_pipelining' from source: unknown 30529 1726882704.36912: variable 'ansible_timeout' from source: unknown 30529 1726882704.36916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882704.37057: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882704.37068: variable 'omit' from source: magic vars 30529 1726882704.37072: starting attempt loop 30529 1726882704.37076: running the handler 30529 1726882704.37088: _low_level_execute_command(): starting 30529 1726882704.37099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882704.37611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882704.37618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882704.37621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.37667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.37670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882704.37672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.37721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.39272: stdout chunk (state=3): >>>/root <<< 30529 1726882704.39375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.39405: stderr chunk (state=3): >>><<< 30529 1726882704.39408: stdout chunk (state=3): >>><<< 30529 1726882704.39431: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882704.39439: _low_level_execute_command(): starting 30529 1726882704.39445: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988 `" && echo ansible-tmp-1726882704.3942685-35757-179497646045988="` echo /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988 `" ) && sleep 0' 30529 1726882704.39867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882704.39870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882704.39872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882704.39882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.39928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.39931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.39980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.41805: stdout chunk (state=3): >>>ansible-tmp-1726882704.3942685-35757-179497646045988=/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988 <<< 30529 1726882704.41908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.41935: stderr chunk (state=3): >>><<< 30529 1726882704.41938: stdout chunk (state=3): >>><<< 30529 1726882704.41950: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882704.3942685-35757-179497646045988=/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882704.41986: variable 'ansible_module_compression' from source: unknown 30529 1726882704.42025: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30529 1726882704.42078: variable 'ansible_facts' from source: unknown 30529 1726882704.42196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py 30529 1726882704.42298: Sending initial data 30529 1726882704.42302: Sent initial data (162 bytes) 30529 1726882704.42724: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882704.42727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882704.42729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.42732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882704.42734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.42780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.42784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.42830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.44334: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882704.44339: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882704.44372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882704.44419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8cypl1l1 /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py <<< 30529 1726882704.44422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py" <<< 30529 1726882704.44456: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp8cypl1l1" to remote "/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py" <<< 30529 1726882704.45458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.45499: stderr chunk (state=3): >>><<< 30529 1726882704.45502: stdout chunk (state=3): >>><<< 30529 1726882704.45539: done transferring module to remote 30529 1726882704.45547: _low_level_execute_command(): starting 30529 1726882704.45550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/ /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py && sleep 0' 30529 1726882704.45966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882704.45969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882704.45971: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.45974: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882704.45979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.46030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882704.46033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.46076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.47771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.47797: stderr chunk (state=3): >>><<< 30529 1726882704.47800: stdout chunk (state=3): >>><<< 30529 1726882704.47812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882704.47815: _low_level_execute_command(): starting 30529 1726882704.47818: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/AnsiballZ_package_facts.py && sleep 0' 30529 1726882704.48227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882704.48231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.48233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882704.48236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882704.48238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.48283: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.48292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.48338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.92121: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30529 1726882704.92138: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30529 1726882704.92264: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": 
[{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": 
"1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", 
"version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": 
[{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30529 1726882704.92300: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30529 1726882704.94103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882704.94129: stderr chunk (state=3): >>><<< 30529 1726882704.94132: stdout chunk (state=3): >>><<< 30529 1726882704.94172: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882704.95548: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882704.95564: _low_level_execute_command(): starting 30529 1726882704.95567: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882704.3942685-35757-179497646045988/ > /dev/null 2>&1 && sleep 0' 30529 1726882704.96032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882704.96036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.96038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882704.96041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882704.96043: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882704.96098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882704.96105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882704.96108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882704.96147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882704.97969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882704.98000: stderr chunk (state=3): >>><<< 30529 1726882704.98003: stdout chunk (state=3): >>><<< 30529 1726882704.98016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882704.98022: handler run complete 30529 1726882704.98490: variable 'ansible_facts' from source: unknown 30529 1726882704.98822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882704.99861: variable 'ansible_facts' from source: unknown 30529 1726882705.00106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.00479: attempt loop complete, returning result 30529 1726882705.00490: _execute() done 30529 1726882705.00495: dumping result to json 30529 1726882705.00606: done dumping result, returning 30529 1726882705.00614: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-b0f1-edc0-0000000026f1] 30529 1726882705.00617: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026f1 30529 1726882705.01928: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026f1 30529 1726882705.01932: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882705.02030: no more pending results, returning what we have 30529 1726882705.02032: results queue empty 30529 1726882705.02033: checking for any_errors_fatal 30529 1726882705.02039: done checking for any_errors_fatal 30529 1726882705.02039: checking for max_fail_percentage 30529 1726882705.02040: done checking for max_fail_percentage 30529 1726882705.02041: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.02042: done checking to see if all hosts have failed 30529 1726882705.02042: getting the remaining hosts for this loop 30529 1726882705.02043: done getting the remaining hosts for this loop 30529 1726882705.02046: getting 
the next task for host managed_node1 30529 1726882705.02051: done getting next task for host managed_node1 30529 1726882705.02053: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882705.02057: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882705.02065: getting variables 30529 1726882705.02066: in VariableManager get_vars() 30529 1726882705.02095: Calling all_inventory to load vars for managed_node1 30529 1726882705.02097: Calling groups_inventory to load vars for managed_node1 30529 1726882705.02099: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.02105: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.02107: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.02109: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.02800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.03664: done with get_vars() 30529 1726882705.03685: done getting variables 30529 1726882705.03731: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:25 -0400 (0:00:00.682) 0:01:59.063 ****** 30529 1726882705.03756: entering _queue_task() for managed_node1/debug 30529 1726882705.04003: worker is 1 (out of 1 available) 30529 1726882705.04017: exiting _queue_task() for managed_node1/debug 30529 1726882705.04030: done queuing things up, now waiting for results queue to drain 30529 1726882705.04031: waiting for pending results... 
30529 1726882705.04222: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 30529 1726882705.04309: in run() - task 12673a56-9f93-b0f1-edc0-000000002695 30529 1726882705.04322: variable 'ansible_search_path' from source: unknown 30529 1726882705.04327: variable 'ansible_search_path' from source: unknown 30529 1726882705.04353: calling self._execute() 30529 1726882705.04439: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.04443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.04451: variable 'omit' from source: magic vars 30529 1726882705.04731: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.04741: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.04747: variable 'omit' from source: magic vars 30529 1726882705.04792: variable 'omit' from source: magic vars 30529 1726882705.04860: variable 'network_provider' from source: set_fact 30529 1726882705.04874: variable 'omit' from source: magic vars 30529 1726882705.04910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882705.04938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882705.04954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882705.04967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882705.04977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882705.05003: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882705.05006: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 
1726882705.05009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.05078: Set connection var ansible_shell_executable to /bin/sh 30529 1726882705.05082: Set connection var ansible_pipelining to False 30529 1726882705.05084: Set connection var ansible_shell_type to sh 30529 1726882705.05095: Set connection var ansible_timeout to 10 30529 1726882705.05098: Set connection var ansible_connection to ssh 30529 1726882705.05102: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882705.05118: variable 'ansible_shell_executable' from source: unknown 30529 1726882705.05121: variable 'ansible_connection' from source: unknown 30529 1726882705.05124: variable 'ansible_module_compression' from source: unknown 30529 1726882705.05126: variable 'ansible_shell_type' from source: unknown 30529 1726882705.05131: variable 'ansible_shell_executable' from source: unknown 30529 1726882705.05133: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.05135: variable 'ansible_pipelining' from source: unknown 30529 1726882705.05138: variable 'ansible_timeout' from source: unknown 30529 1726882705.05140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.05239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882705.05248: variable 'omit' from source: magic vars 30529 1726882705.05252: starting attempt loop 30529 1726882705.05256: running the handler 30529 1726882705.05314: handler run complete 30529 1726882705.05324: attempt loop complete, returning result 30529 1726882705.05327: _execute() done 30529 1726882705.05330: dumping result to json 30529 1726882705.05333: done dumping result, returning 
30529 1726882705.05338: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-b0f1-edc0-000000002695] 30529 1726882705.05343: sending task result for task 12673a56-9f93-b0f1-edc0-000000002695 30529 1726882705.05427: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002695 30529 1726882705.05429: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 30529 1726882705.05498: no more pending results, returning what we have 30529 1726882705.05501: results queue empty 30529 1726882705.05502: checking for any_errors_fatal 30529 1726882705.05512: done checking for any_errors_fatal 30529 1726882705.05512: checking for max_fail_percentage 30529 1726882705.05514: done checking for max_fail_percentage 30529 1726882705.05515: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.05515: done checking to see if all hosts have failed 30529 1726882705.05516: getting the remaining hosts for this loop 30529 1726882705.05518: done getting the remaining hosts for this loop 30529 1726882705.05521: getting the next task for host managed_node1 30529 1726882705.05529: done getting next task for host managed_node1 30529 1726882705.05532: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882705.05536: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.05549: getting variables 30529 1726882705.05550: in VariableManager get_vars() 30529 1726882705.05586: Calling all_inventory to load vars for managed_node1 30529 1726882705.05591: Calling groups_inventory to load vars for managed_node1 30529 1726882705.05598: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.05607: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.05610: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.05613: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.06489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.07357: done with get_vars() 30529 1726882705.07372: done getting variables 30529 1726882705.07418: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:25 -0400 (0:00:00.036) 0:01:59.100 ****** 30529 1726882705.07449: entering _queue_task() for managed_node1/fail 30529 1726882705.07685: worker is 1 (out of 1 available) 30529 1726882705.07705: exiting _queue_task() for managed_node1/fail 30529 1726882705.07719: done queuing things up, now waiting for results queue to drain 30529 1726882705.07720: waiting for pending results... 30529 1726882705.07906: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30529 1726882705.07999: in run() - task 12673a56-9f93-b0f1-edc0-000000002696 30529 1726882705.08012: variable 'ansible_search_path' from source: unknown 30529 1726882705.08017: variable 'ansible_search_path' from source: unknown 30529 1726882705.08043: calling self._execute() 30529 1726882705.08122: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.08126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.08134: variable 'omit' from source: magic vars 30529 1726882705.08417: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.08427: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.08511: variable 'network_state' from source: role '' defaults 30529 1726882705.08520: Evaluated conditional (network_state != {}): False 30529 1726882705.08523: when evaluation is False, skipping this task 30529 1726882705.08526: _execute() done 30529 1726882705.08529: dumping result to json 30529 1726882705.08531: done dumping result, returning 30529 1726882705.08538: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-b0f1-edc0-000000002696] 30529 1726882705.08542: sending task result for task 12673a56-9f93-b0f1-edc0-000000002696 30529 1726882705.08633: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002696 30529 1726882705.08636: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882705.08680: no more pending results, returning what we have 30529 1726882705.08684: results queue empty 30529 1726882705.08685: checking for any_errors_fatal 30529 1726882705.08698: done checking for any_errors_fatal 30529 1726882705.08699: checking for max_fail_percentage 30529 1726882705.08700: done checking for max_fail_percentage 30529 1726882705.08701: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.08702: done checking to see if all hosts have failed 30529 1726882705.08703: getting the remaining hosts for this loop 30529 1726882705.08704: done getting the remaining hosts for this loop 30529 1726882705.08707: getting the next task for host managed_node1 30529 1726882705.08716: done getting next task for host managed_node1 30529 1726882705.08720: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882705.08725: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.08748: getting variables 30529 1726882705.08750: in VariableManager get_vars() 30529 1726882705.08790: Calling all_inventory to load vars for managed_node1 30529 1726882705.08797: Calling groups_inventory to load vars for managed_node1 30529 1726882705.08800: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.08809: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.08812: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.08815: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.09597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.10467: done with get_vars() 30529 1726882705.10484: done getting variables 30529 1726882705.10531: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:25 -0400 (0:00:00.031) 0:01:59.131 ****** 30529 1726882705.10557: entering _queue_task() for managed_node1/fail 30529 1726882705.10803: worker is 1 (out of 1 available) 30529 1726882705.10817: exiting _queue_task() for managed_node1/fail 30529 1726882705.10830: done queuing things up, now waiting for results queue to drain 30529 1726882705.10832: waiting for pending results... 30529 1726882705.11012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30529 1726882705.11097: in run() - task 12673a56-9f93-b0f1-edc0-000000002697 30529 1726882705.11110: variable 'ansible_search_path' from source: unknown 30529 1726882705.11114: variable 'ansible_search_path' from source: unknown 30529 1726882705.11141: calling self._execute() 30529 1726882705.11222: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.11226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.11233: variable 'omit' from source: magic vars 30529 1726882705.11514: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.11523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.11609: variable 'network_state' from source: role '' defaults 30529 1726882705.11615: Evaluated conditional (network_state != {}): False 30529 1726882705.11618: when evaluation is False, skipping this task 30529 1726882705.11621: _execute() done 30529 1726882705.11624: dumping result to json 30529 1726882705.11628: done dumping result, returning 30529 1726882705.11635: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [12673a56-9f93-b0f1-edc0-000000002697] 30529 1726882705.11640: sending task result for task 12673a56-9f93-b0f1-edc0-000000002697 30529 1726882705.11730: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002697 30529 1726882705.11733: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882705.11777: no more pending results, returning what we have 30529 1726882705.11781: results queue empty 30529 1726882705.11782: checking for any_errors_fatal 30529 1726882705.11791: done checking for any_errors_fatal 30529 1726882705.11792: checking for max_fail_percentage 30529 1726882705.11795: done checking for max_fail_percentage 30529 1726882705.11796: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.11797: done checking to see if all hosts have failed 30529 1726882705.11798: getting the remaining hosts for this loop 30529 1726882705.11800: done getting the remaining hosts for this loop 30529 1726882705.11803: getting the next task for host managed_node1 30529 1726882705.11811: done getting next task for host managed_node1 30529 1726882705.11815: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882705.11819: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.11842: getting variables 30529 1726882705.11844: in VariableManager get_vars() 30529 1726882705.11883: Calling all_inventory to load vars for managed_node1 30529 1726882705.11885: Calling groups_inventory to load vars for managed_node1 30529 1726882705.11890: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.11903: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.11906: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.11909: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.12807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.13658: done with get_vars() 30529 1726882705.13674: done getting variables 30529 1726882705.13719: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:25 -0400 (0:00:00.031) 0:01:59.163 ****** 30529 1726882705.13744: entering _queue_task() for managed_node1/fail 30529 1726882705.13973: worker is 1 (out of 1 available) 30529 1726882705.13992: exiting _queue_task() for managed_node1/fail 30529 1726882705.14007: done queuing things up, now waiting for results queue to drain 30529 1726882705.14009: waiting for pending results... 30529 1726882705.14191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30529 1726882705.14287: in run() - task 12673a56-9f93-b0f1-edc0-000000002698 30529 1726882705.14300: variable 'ansible_search_path' from source: unknown 30529 1726882705.14305: variable 'ansible_search_path' from source: unknown 30529 1726882705.14331: calling self._execute() 30529 1726882705.14407: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.14411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.14420: variable 'omit' from source: magic vars 30529 1726882705.14684: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.14696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.14816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.16318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.16368: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.16396: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.16427: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.16447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.16505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.16529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.16547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.16573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.16583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.16653: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.16666: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30529 1726882705.16744: variable 'ansible_distribution' from source: facts 30529 1726882705.16747: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.16755: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30529 1726882705.16911: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.16927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.16946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.16973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.16983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.17017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.17033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.17049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.17076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 
1726882705.17087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.17118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.17133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.17150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.17176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.17186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.17374: variable 'network_connections' from source: include params 30529 1726882705.17383: variable 'interface' from source: play vars 30529 1726882705.17431: variable 'interface' from source: play vars 30529 1726882705.17440: variable 'network_state' from source: role '' defaults 30529 1726882705.17484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.17592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.17622: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.17643: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.17664: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.17706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.17723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.17745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.17762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.17780: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30529 1726882705.17783: when evaluation is False, skipping this task 30529 1726882705.17786: _execute() done 30529 1726882705.17791: dumping result to json 30529 1726882705.17795: done dumping result, returning 30529 1726882705.17799: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-b0f1-edc0-000000002698] 30529 1726882705.17804: sending task result for task 
12673a56-9f93-b0f1-edc0-000000002698 30529 1726882705.17885: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002698 30529 1726882705.17891: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30529 1726882705.17973: no more pending results, returning what we have 30529 1726882705.17976: results queue empty 30529 1726882705.17977: checking for any_errors_fatal 30529 1726882705.17984: done checking for any_errors_fatal 30529 1726882705.17984: checking for max_fail_percentage 30529 1726882705.17986: done checking for max_fail_percentage 30529 1726882705.17990: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.17990: done checking to see if all hosts have failed 30529 1726882705.17991: getting the remaining hosts for this loop 30529 1726882705.17994: done getting the remaining hosts for this loop 30529 1726882705.17998: getting the next task for host managed_node1 30529 1726882705.18006: done getting next task for host managed_node1 30529 1726882705.18009: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882705.18015: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.18038: getting variables 30529 1726882705.18040: in VariableManager get_vars() 30529 1726882705.18080: Calling all_inventory to load vars for managed_node1 30529 1726882705.18082: Calling groups_inventory to load vars for managed_node1 30529 1726882705.18084: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.18100: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.18103: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.18106: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.18902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.19770: done with get_vars() 30529 1726882705.19790: done getting variables 30529 1726882705.19835: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:25 -0400 (0:00:00.061) 0:01:59.224 ****** 30529 1726882705.19859: entering _queue_task() for managed_node1/dnf 30529 1726882705.20113: worker is 1 (out of 1 available) 30529 1726882705.20128: exiting _queue_task() for managed_node1/dnf 30529 1726882705.20143: done queuing things up, now waiting for results queue to drain 30529 1726882705.20144: waiting for pending results... 30529 1726882705.20344: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30529 1726882705.20444: in run() - task 12673a56-9f93-b0f1-edc0-000000002699 30529 1726882705.20457: variable 'ansible_search_path' from source: unknown 30529 1726882705.20460: variable 'ansible_search_path' from source: unknown 30529 1726882705.20490: calling self._execute() 30529 1726882705.20570: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.20574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.20584: variable 'omit' from source: magic vars 30529 1726882705.20865: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.20875: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.21016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.22799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.22844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.22880: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.22909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.22930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.22988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.23011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.23029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.23055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.23065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.23146: variable 'ansible_distribution' from source: facts 30529 1726882705.23149: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.23162: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30529 1726882705.23241: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.23327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.23344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.23361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.23385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.23400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.23429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.23445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.23461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.23484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.23499: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.23528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.23543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.23560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.23584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.23598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.23698: variable 'network_connections' from source: include params 30529 1726882705.23708: variable 'interface' from source: play vars 30529 1726882705.23752: variable 'interface' from source: play vars 30529 1726882705.23804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.23914: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.23939: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.23964: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.23984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.24019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.24035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.24055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.24075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.24114: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882705.24268: variable 'network_connections' from source: include params 30529 1726882705.24272: variable 'interface' from source: play vars 30529 1726882705.24320: variable 'interface' from source: play vars 30529 1726882705.24338: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882705.24342: when evaluation is False, skipping this task 30529 1726882705.24344: _execute() done 30529 1726882705.24346: dumping result to json 30529 1726882705.24349: done dumping result, returning 30529 1726882705.24356: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-000000002699] 30529 
1726882705.24361: sending task result for task 12673a56-9f93-b0f1-edc0-000000002699 30529 1726882705.24451: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002699 30529 1726882705.24454: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882705.24502: no more pending results, returning what we have 30529 1726882705.24506: results queue empty 30529 1726882705.24507: checking for any_errors_fatal 30529 1726882705.24513: done checking for any_errors_fatal 30529 1726882705.24514: checking for max_fail_percentage 30529 1726882705.24516: done checking for max_fail_percentage 30529 1726882705.24517: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.24518: done checking to see if all hosts have failed 30529 1726882705.24518: getting the remaining hosts for this loop 30529 1726882705.24520: done getting the remaining hosts for this loop 30529 1726882705.24524: getting the next task for host managed_node1 30529 1726882705.24532: done getting next task for host managed_node1 30529 1726882705.24535: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882705.24540: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.24568: getting variables 30529 1726882705.24569: in VariableManager get_vars() 30529 1726882705.24619: Calling all_inventory to load vars for managed_node1 30529 1726882705.24621: Calling groups_inventory to load vars for managed_node1 30529 1726882705.24624: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.24634: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.24636: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.24639: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.25579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.26437: done with get_vars() 30529 1726882705.26453: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30529 1726882705.26507: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:25 -0400 (0:00:00.066) 0:01:59.291 ****** 30529 1726882705.26532: entering _queue_task() for managed_node1/yum 30529 1726882705.26774: worker is 1 (out of 1 available) 30529 1726882705.26789: exiting _queue_task() for managed_node1/yum 30529 1726882705.26804: done queuing things up, now waiting for results queue to drain 30529 1726882705.26806: waiting for pending results... 30529 1726882705.26998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30529 1726882705.27098: in run() - task 12673a56-9f93-b0f1-edc0-00000000269a 30529 1726882705.27120: variable 'ansible_search_path' from source: unknown 30529 1726882705.27124: variable 'ansible_search_path' from source: unknown 30529 1726882705.27154: calling self._execute() 30529 1726882705.27229: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.27232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.27242: variable 'omit' from source: magic vars 30529 1726882705.27516: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.27526: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.27643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.29130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.29181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.29212: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.29240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.29260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.29322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.29343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.29361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.29385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.29399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.29471: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.29483: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30529 1726882705.29486: when evaluation is False, skipping this task 30529 1726882705.29492: _execute() done 30529 1726882705.29496: dumping result to json 30529 1726882705.29498: done dumping result, returning 30529 1726882705.29504: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000269a] 30529 1726882705.29507: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269a 30529 1726882705.29601: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000269a 30529 1726882705.29604: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30529 1726882705.29679: no more pending results, returning what we have 30529 1726882705.29683: results queue empty 30529 1726882705.29684: checking for any_errors_fatal 30529 1726882705.29695: done checking for any_errors_fatal 30529 1726882705.29696: checking for max_fail_percentage 30529 1726882705.29698: done checking for max_fail_percentage 30529 1726882705.29699: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.29700: done checking to see if all hosts have failed 30529 1726882705.29700: getting the remaining hosts for this loop 30529 1726882705.29702: done getting the remaining hosts for this loop 30529 1726882705.29705: getting the next task for host managed_node1 30529 1726882705.29714: done getting next task for host managed_node1 30529 1726882705.29718: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882705.29723: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.29749: getting variables 30529 1726882705.29750: in VariableManager get_vars() 30529 1726882705.29797: Calling all_inventory to load vars for managed_node1 30529 1726882705.29800: Calling groups_inventory to load vars for managed_node1 30529 1726882705.29803: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.29812: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.29815: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.29817: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.30632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.35921: done with get_vars() 30529 1726882705.35939: done getting variables 30529 1726882705.35972: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:25 -0400 (0:00:00.094) 0:01:59.386 ****** 30529 1726882705.35997: entering _queue_task() for managed_node1/fail 30529 1726882705.36269: worker is 1 (out of 1 available) 30529 1726882705.36282: exiting _queue_task() for managed_node1/fail 30529 1726882705.36295: done queuing things up, now waiting for results queue to drain 30529 1726882705.36297: waiting for pending results... 30529 1726882705.36486: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30529 1726882705.36604: in run() - task 12673a56-9f93-b0f1-edc0-00000000269b 30529 1726882705.36615: variable 'ansible_search_path' from source: unknown 30529 1726882705.36620: variable 'ansible_search_path' from source: unknown 30529 1726882705.36650: calling self._execute() 30529 1726882705.36726: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.36730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.36739: variable 'omit' from source: magic vars 30529 1726882705.37020: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.37030: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.37119: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.37250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.38744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.38798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.38829: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.38856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.38877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.38938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.38958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.38976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.39006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.39018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.39051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.39067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.39084: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.39113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.39123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.39154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.39169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.39185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.39213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.39224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.39340: variable 'network_connections' from source: include params 30529 1726882705.39350: variable 'interface' from source: play vars 30529 1726882705.39401: variable 'interface' from source: play vars 30529 1726882705.39449: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.39564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.39596: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.39618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.39639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.39669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.39687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.39708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.39725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.39762: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882705.39916: variable 'network_connections' from source: include params 30529 1726882705.39920: variable 'interface' from source: play vars 30529 1726882705.39962: variable 'interface' from source: play vars 30529 1726882705.39980: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882705.39984: when evaluation is False, skipping this task 30529 
1726882705.39986: _execute() done 30529 1726882705.39991: dumping result to json 30529 1726882705.39995: done dumping result, returning 30529 1726882705.39999: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000269b] 30529 1726882705.40008: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269b 30529 1726882705.40094: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000269b 30529 1726882705.40097: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882705.40157: no more pending results, returning what we have 30529 1726882705.40160: results queue empty 30529 1726882705.40161: checking for any_errors_fatal 30529 1726882705.40169: done checking for any_errors_fatal 30529 1726882705.40170: checking for max_fail_percentage 30529 1726882705.40171: done checking for max_fail_percentage 30529 1726882705.40172: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.40173: done checking to see if all hosts have failed 30529 1726882705.40174: getting the remaining hosts for this loop 30529 1726882705.40177: done getting the remaining hosts for this loop 30529 1726882705.40180: getting the next task for host managed_node1 30529 1726882705.40191: done getting next task for host managed_node1 30529 1726882705.40197: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30529 1726882705.40201: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.40229: getting variables 30529 1726882705.40230: in VariableManager get_vars() 30529 1726882705.40272: Calling all_inventory to load vars for managed_node1 30529 1726882705.40274: Calling groups_inventory to load vars for managed_node1 30529 1726882705.40276: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.40285: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.40290: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.40292: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.41116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.41985: done with get_vars() 30529 1726882705.42004: done getting variables 30529 1726882705.42047: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:25 -0400 (0:00:00.060) 0:01:59.446 ****** 30529 1726882705.42072: entering _queue_task() for managed_node1/package 30529 1726882705.42307: worker is 1 (out of 1 available) 30529 1726882705.42320: exiting _queue_task() for managed_node1/package 30529 1726882705.42332: done queuing things up, now waiting for results queue to drain 30529 1726882705.42334: waiting for pending results... 30529 1726882705.42514: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 30529 1726882705.42620: in run() - task 12673a56-9f93-b0f1-edc0-00000000269c 30529 1726882705.42632: variable 'ansible_search_path' from source: unknown 30529 1726882705.42639: variable 'ansible_search_path' from source: unknown 30529 1726882705.42664: calling self._execute() 30529 1726882705.42737: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.42740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.42748: variable 'omit' from source: magic vars 30529 1726882705.43007: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.43016: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.43143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.43328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.43358: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.43382: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.43441: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.43521: variable 'network_packages' from source: role '' defaults 30529 1726882705.43592: variable '__network_provider_setup' from source: role '' defaults 30529 1726882705.43655: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882705.43658: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882705.43661: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882705.43699: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882705.43807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.45112: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.45155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.45183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.45209: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.45230: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.45560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.45579: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.45599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.45629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.45639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.45671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.45686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.45707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.45734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.45745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 
1726882705.45878: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882705.45950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.45966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.45982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.46010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.46022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.46083: variable 'ansible_python' from source: facts 30529 1726882705.46097: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882705.46153: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882705.46204: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882705.46282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.46301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.46318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.46341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.46351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.46384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.46406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.46423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.46446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.46457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.46551: variable 'network_connections' from source: include params 
30529 1726882705.46554: variable 'interface' from source: play vars 30529 1726882705.46627: variable 'interface' from source: play vars 30529 1726882705.46674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.46692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.46721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.46742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.46778: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.46955: variable 'network_connections' from source: include params 30529 1726882705.46958: variable 'interface' from source: play vars 30529 1726882705.47028: variable 'interface' from source: play vars 30529 1726882705.47050: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882705.47104: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.47296: variable 'network_connections' from source: include params 30529 1726882705.47299: variable 'interface' from source: play vars 30529 1726882705.47343: variable 'interface' from source: play vars 30529 1726882705.47361: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882705.47414: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882705.47604: variable 'network_connections' 
from source: include params 30529 1726882705.47608: variable 'interface' from source: play vars 30529 1726882705.47653: variable 'interface' from source: play vars 30529 1726882705.47695: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882705.47734: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882705.47740: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882705.47783: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882705.47915: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882705.48199: variable 'network_connections' from source: include params 30529 1726882705.48202: variable 'interface' from source: play vars 30529 1726882705.48246: variable 'interface' from source: play vars 30529 1726882705.48252: variable 'ansible_distribution' from source: facts 30529 1726882705.48255: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.48261: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.48271: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882705.48377: variable 'ansible_distribution' from source: facts 30529 1726882705.48380: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.48384: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.48397: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882705.48501: variable 'ansible_distribution' from source: facts 30529 1726882705.48505: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.48509: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.48534: variable 'network_provider' from source: set_fact 30529 
1726882705.48546: variable 'ansible_facts' from source: unknown 30529 1726882705.48908: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30529 1726882705.48912: when evaluation is False, skipping this task 30529 1726882705.48914: _execute() done 30529 1726882705.48917: dumping result to json 30529 1726882705.48919: done dumping result, returning 30529 1726882705.48926: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-b0f1-edc0-00000000269c] 30529 1726882705.48929: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269c 30529 1726882705.49028: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000269c 30529 1726882705.49030: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30529 1726882705.49076: no more pending results, returning what we have 30529 1726882705.49079: results queue empty 30529 1726882705.49080: checking for any_errors_fatal 30529 1726882705.49085: done checking for any_errors_fatal 30529 1726882705.49085: checking for max_fail_percentage 30529 1726882705.49090: done checking for max_fail_percentage 30529 1726882705.49091: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.49091: done checking to see if all hosts have failed 30529 1726882705.49092: getting the remaining hosts for this loop 30529 1726882705.49096: done getting the remaining hosts for this loop 30529 1726882705.49099: getting the next task for host managed_node1 30529 1726882705.49108: done getting next task for host managed_node1 30529 1726882705.49112: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882705.49117: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882705.49144: getting variables 30529 1726882705.49145: in VariableManager get_vars() 30529 1726882705.49199: Calling all_inventory to load vars for managed_node1 30529 1726882705.49202: Calling groups_inventory to load vars for managed_node1 30529 1726882705.49204: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.49213: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.49216: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.49219: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.50196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.51059: done with get_vars() 30529 1726882705.51074: done getting variables 30529 1726882705.51119: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:25 -0400 (0:00:00.090) 0:01:59.537 ****** 30529 1726882705.51146: entering _queue_task() for managed_node1/package 30529 1726882705.51384: worker is 1 (out of 1 available) 30529 1726882705.51402: exiting _queue_task() for managed_node1/package 30529 1726882705.51416: done queuing things up, now waiting for results queue to drain 30529 1726882705.51418: waiting for pending results... 
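The "Install packages" skip above hinges on Jinja2's `subset` test: `not network_packages is subset(ansible_facts.packages.keys())` is only true when some requested package is missing from the gathered package facts. As a minimal sketch in plain Python (the variable values below are hypothetical stand-ins, not taken from this run):

```python
# Hedged sketch: Jinja2's `subset` test reduces to set containment.
# The task is gated on:
#   when: not network_packages is subset(ansible_facts.packages.keys())
network_packages = ["NetworkManager"]                  # assumed role default
packages_facts = {"NetworkManager": [], "kernel": []}  # shape of ansible_facts.packages

# All requested packages are already present, so the condition is False
# and the task is skipped -- matching "Conditional result was False" above.
need_install = not set(network_packages).issubset(packages_facts.keys())
print(need_install)
```

This is why the log reports `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"` rather than running the package module: the fact cache already lists every package the role would install.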
30529 1726882705.51601: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30529 1726882705.51722: in run() - task 12673a56-9f93-b0f1-edc0-00000000269d 30529 1726882705.51733: variable 'ansible_search_path' from source: unknown 30529 1726882705.51738: variable 'ansible_search_path' from source: unknown 30529 1726882705.51766: calling self._execute() 30529 1726882705.51847: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.51851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.51863: variable 'omit' from source: magic vars 30529 1726882705.52142: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.52151: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.52237: variable 'network_state' from source: role '' defaults 30529 1726882705.52244: Evaluated conditional (network_state != {}): False 30529 1726882705.52247: when evaluation is False, skipping this task 30529 1726882705.52249: _execute() done 30529 1726882705.52252: dumping result to json 30529 1726882705.52254: done dumping result, returning 30529 1726882705.52261: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-b0f1-edc0-00000000269d] 30529 1726882705.52266: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269d 30529 1726882705.52354: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000269d 30529 1726882705.52357: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882705.52404: no more pending results, returning what we have 30529 1726882705.52408: results queue empty 30529 1726882705.52409: checking 
for any_errors_fatal 30529 1726882705.52417: done checking for any_errors_fatal 30529 1726882705.52417: checking for max_fail_percentage 30529 1726882705.52419: done checking for max_fail_percentage 30529 1726882705.52420: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.52421: done checking to see if all hosts have failed 30529 1726882705.52422: getting the remaining hosts for this loop 30529 1726882705.52423: done getting the remaining hosts for this loop 30529 1726882705.52427: getting the next task for host managed_node1 30529 1726882705.52435: done getting next task for host managed_node1 30529 1726882705.52438: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882705.52443: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882705.52471: getting variables 30529 1726882705.52473: in VariableManager get_vars() 30529 1726882705.52516: Calling all_inventory to load vars for managed_node1 30529 1726882705.52519: Calling groups_inventory to load vars for managed_node1 30529 1726882705.52521: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.52531: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.52533: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.52535: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.53290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.54152: done with get_vars() 30529 1726882705.54166: done getting variables 30529 1726882705.54208: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:25 -0400 (0:00:00.030) 0:01:59.568 ****** 30529 1726882705.54233: entering _queue_task() for managed_node1/package 30529 1726882705.54434: worker is 1 (out of 1 available) 30529 1726882705.54449: exiting _queue_task() for managed_node1/package 30529 1726882705.54462: done queuing things up, now waiting for results queue to drain 30529 1726882705.54463: waiting for pending results... 
30529 1726882705.54642: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30529 1726882705.54745: in run() - task 12673a56-9f93-b0f1-edc0-00000000269e 30529 1726882705.54757: variable 'ansible_search_path' from source: unknown 30529 1726882705.54761: variable 'ansible_search_path' from source: unknown 30529 1726882705.54787: calling self._execute() 30529 1726882705.54862: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.54866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.54875: variable 'omit' from source: magic vars 30529 1726882705.55136: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.55146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.55232: variable 'network_state' from source: role '' defaults 30529 1726882705.55241: Evaluated conditional (network_state != {}): False 30529 1726882705.55244: when evaluation is False, skipping this task 30529 1726882705.55246: _execute() done 30529 1726882705.55249: dumping result to json 30529 1726882705.55251: done dumping result, returning 30529 1726882705.55259: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-b0f1-edc0-00000000269e] 30529 1726882705.55263: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269e 30529 1726882705.55350: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000269e 30529 1726882705.55353: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882705.55397: no more pending results, returning what we have 30529 1726882705.55401: results queue empty 30529 1726882705.55402: checking for 
any_errors_fatal 30529 1726882705.55409: done checking for any_errors_fatal 30529 1726882705.55409: checking for max_fail_percentage 30529 1726882705.55411: done checking for max_fail_percentage 30529 1726882705.55412: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.55413: done checking to see if all hosts have failed 30529 1726882705.55413: getting the remaining hosts for this loop 30529 1726882705.55415: done getting the remaining hosts for this loop 30529 1726882705.55418: getting the next task for host managed_node1 30529 1726882705.55426: done getting next task for host managed_node1 30529 1726882705.55429: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882705.55434: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882705.55456: getting variables 30529 1726882705.55458: in VariableManager get_vars() 30529 1726882705.55492: Calling all_inventory to load vars for managed_node1 30529 1726882705.55496: Calling groups_inventory to load vars for managed_node1 30529 1726882705.55498: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.55506: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.55509: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.55511: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.56392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.57241: done with get_vars() 30529 1726882705.57256: done getting variables 30529 1726882705.57299: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:25 -0400 (0:00:00.030) 0:01:59.599 ****** 30529 1726882705.57326: entering _queue_task() for managed_node1/service 30529 1726882705.57545: worker is 1 (out of 1 available) 30529 1726882705.57559: exiting _queue_task() for managed_node1/service 30529 1726882705.57571: done queuing things up, now waiting for results queue to drain 30529 1726882705.57573: waiting for pending results... 
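Both nmstate-related install tasks above ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") share the same gate, `when: network_state != {}`. With the role default of an empty dict the gate evaluates to False and both tasks are skipped; a caller passing any desired state would enable them. In plain Python terms (a sketch, not the role's actual code):

```python
# Hedged sketch of the shared conditional:
#   when: network_state != {}
network_state = {}                       # role default when no network_state is given
run_nmstate_tasks = network_state != {}  # False -> both tasks skipped, as logged
print(run_nmstate_tasks)
```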
30529 1726882705.57755: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30529 1726882705.57859: in run() - task 12673a56-9f93-b0f1-edc0-00000000269f 30529 1726882705.57871: variable 'ansible_search_path' from source: unknown 30529 1726882705.57875: variable 'ansible_search_path' from source: unknown 30529 1726882705.57906: calling self._execute() 30529 1726882705.57979: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.57983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.57995: variable 'omit' from source: magic vars 30529 1726882705.58270: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.58280: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.58368: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.58501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.59977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.60033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.60058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.60084: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.60114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.60168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30529 1726882705.60188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.60213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.60239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.60250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.60282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.60303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.60322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.60346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.60356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.60383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.60404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.60424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.60448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.60458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.60574: variable 'network_connections' from source: include params 30529 1726882705.60584: variable 'interface' from source: play vars 30529 1726882705.60636: variable 'interface' from source: play vars 30529 1726882705.60684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.60799: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.60835: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.60857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.60882: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.60914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.60929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.60946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.60968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.61010: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882705.61161: variable 'network_connections' from source: include params 30529 1726882705.61166: variable 'interface' from source: play vars 30529 1726882705.61213: variable 'interface' from source: play vars 30529 1726882705.61234: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30529 1726882705.61237: when evaluation is False, skipping this task 30529 1726882705.61240: _execute() done 30529 1726882705.61243: dumping result to json 30529 1726882705.61245: done dumping result, returning 30529 1726882705.61251: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-b0f1-edc0-00000000269f] 30529 1726882705.61256: sending task result for task 12673a56-9f93-b0f1-edc0-00000000269f 30529 1726882705.61343: done sending task result for task 
12673a56-9f93-b0f1-edc0-00000000269f 30529 1726882705.61353: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30529 1726882705.61400: no more pending results, returning what we have 30529 1726882705.61403: results queue empty 30529 1726882705.61404: checking for any_errors_fatal 30529 1726882705.61410: done checking for any_errors_fatal 30529 1726882705.61410: checking for max_fail_percentage 30529 1726882705.61412: done checking for max_fail_percentage 30529 1726882705.61413: checking to see if all hosts have failed and the running result is not ok 30529 1726882705.61414: done checking to see if all hosts have failed 30529 1726882705.61414: getting the remaining hosts for this loop 30529 1726882705.61416: done getting the remaining hosts for this loop 30529 1726882705.61420: getting the next task for host managed_node1 30529 1726882705.61428: done getting next task for host managed_node1 30529 1726882705.61432: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882705.61437: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882705.61463: getting variables 30529 1726882705.61465: in VariableManager get_vars() 30529 1726882705.61510: Calling all_inventory to load vars for managed_node1 30529 1726882705.61512: Calling groups_inventory to load vars for managed_node1 30529 1726882705.61514: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882705.61523: Calling all_plugins_play to load vars for managed_node1 30529 1726882705.61526: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882705.61528: Calling groups_plugins_play to load vars for managed_node1 30529 1726882705.62349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882705.63222: done with get_vars() 30529 1726882705.63237: done getting variables 30529 1726882705.63277: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:25 -0400 (0:00:00.059) 0:01:59.659 ****** 30529 1726882705.63305: entering _queue_task() for managed_node1/service 30529 1726882705.63530: worker is 1 (out of 1 available) 30529 1726882705.63541: exiting _queue_task() for managed_node1/service 30529 1726882705.63553: done 
queuing things up, now waiting for results queue to drain 30529 1726882705.63555: waiting for pending results... 30529 1726882705.63736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30529 1726882705.63838: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a0 30529 1726882705.63851: variable 'ansible_search_path' from source: unknown 30529 1726882705.63855: variable 'ansible_search_path' from source: unknown 30529 1726882705.63880: calling self._execute() 30529 1726882705.63955: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.63960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.63967: variable 'omit' from source: magic vars 30529 1726882705.64238: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.64247: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882705.64362: variable 'network_provider' from source: set_fact 30529 1726882705.64366: variable 'network_state' from source: role '' defaults 30529 1726882705.64375: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30529 1726882705.64381: variable 'omit' from source: magic vars 30529 1726882705.64425: variable 'omit' from source: magic vars 30529 1726882705.64446: variable 'network_service_name' from source: role '' defaults 30529 1726882705.64495: variable 'network_service_name' from source: role '' defaults 30529 1726882705.64565: variable '__network_provider_setup' from source: role '' defaults 30529 1726882705.64568: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882705.64615: variable '__network_service_name_default_nm' from source: role '' defaults 30529 1726882705.64623: variable '__network_packages_default_nm' from source: role '' defaults 30529 1726882705.64667: variable '__network_packages_default_nm' from source: role '' 
defaults 30529 1726882705.64816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882705.66548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882705.66603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882705.66631: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882705.66656: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882705.66675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882705.66738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.66758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.66775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.66805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.66816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.66850: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.66866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.66883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.66911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.66921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.67070: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30529 1726882705.67149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.67166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.67182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.67209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.67220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.67282: variable 'ansible_python' from source: facts 30529 1726882705.67297: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30529 1726882705.67351: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882705.67406: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882705.67484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.67509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.67525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.67549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.67559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.67597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882705.67619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882705.67635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.67659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882705.67669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882705.67764: variable 'network_connections' from source: include params 30529 1726882705.67770: variable 'interface' from source: play vars 30529 1726882705.67828: variable 'interface' from source: play vars 30529 1726882705.67898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882705.68030: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882705.68065: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882705.68099: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882705.68129: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882705.68171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882705.68192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882705.68217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882705.68240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882705.68277: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.68458: variable 'network_connections' from source: include params 30529 1726882705.68467: variable 'interface' from source: play vars 30529 1726882705.68519: variable 'interface' from source: play vars 30529 1726882705.68542: variable '__network_packages_default_wireless' from source: role '' defaults 30529 1726882705.68598: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882705.68780: variable 'network_connections' from source: include params 30529 1726882705.68783: variable 'interface' from source: play vars 30529 1726882705.68836: variable 'interface' from source: play vars 30529 1726882705.68852: variable '__network_packages_default_team' from source: role '' defaults 30529 1726882705.68907: variable '__network_team_connections_defined' from source: role '' defaults 30529 1726882705.69083: variable 'network_connections' from source: include params 30529 1726882705.69086: variable 'interface' from source: play vars 30529 1726882705.69138: variable 'interface' from source: play vars 30529 1726882705.69174: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30529 1726882705.69217: variable '__network_service_name_default_initscripts' from source: role '' defaults 30529 1726882705.69222: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882705.69265: variable '__network_packages_default_initscripts' from source: role '' defaults 30529 1726882705.69396: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30529 1726882705.69686: variable 'network_connections' from source: include params 30529 1726882705.69692: variable 'interface' from source: play vars 30529 1726882705.69733: variable 'interface' from source: play vars 30529 1726882705.69740: variable 'ansible_distribution' from source: facts 30529 1726882705.69742: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.69749: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.69759: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30529 1726882705.69870: variable 'ansible_distribution' from source: facts 30529 1726882705.69873: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.69884: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.69894: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30529 1726882705.69999: variable 'ansible_distribution' from source: facts 30529 1726882705.70002: variable '__network_rh_distros' from source: role '' defaults 30529 1726882705.70008: variable 'ansible_distribution_major_version' from source: facts 30529 1726882705.70034: variable 'network_provider' from source: set_fact 30529 1726882705.70051: variable 'omit' from source: magic vars 30529 1726882705.70072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882705.70095: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882705.70112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882705.70125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882705.70135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882705.70156: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882705.70159: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.70162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882705.70233: Set connection var ansible_shell_executable to /bin/sh 30529 1726882705.70236: Set connection var ansible_pipelining to False 30529 1726882705.70239: Set connection var ansible_shell_type to sh 30529 1726882705.70247: Set connection var ansible_timeout to 10 30529 1726882705.70249: Set connection var ansible_connection to ssh 30529 1726882705.70254: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882705.70272: variable 'ansible_shell_executable' from source: unknown 30529 1726882705.70274: variable 'ansible_connection' from source: unknown 30529 1726882705.70277: variable 'ansible_module_compression' from source: unknown 30529 1726882705.70279: variable 'ansible_shell_type' from source: unknown 30529 1726882705.70281: variable 'ansible_shell_executable' from source: unknown 30529 1726882705.70283: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882705.70290: variable 'ansible_pipelining' from source: unknown 30529 1726882705.70292: variable 'ansible_timeout' from source: unknown 30529 1726882705.70296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882705.70363: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882705.70371: variable 'omit' from source: magic vars 30529 1726882705.70377: starting attempt loop 30529 1726882705.70380: running the handler 30529 1726882705.70437: variable 'ansible_facts' from source: unknown 30529 1726882705.70836: _low_level_execute_command(): starting 30529 1726882705.70842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882705.71343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.71347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.71349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882705.71352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882705.71354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.71399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 30529 1726882705.71404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882705.71422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882705.71472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882705.73164: stdout chunk (state=3): >>>/root <<< 30529 1726882705.73264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882705.73294: stderr chunk (state=3): >>><<< 30529 1726882705.73298: stdout chunk (state=3): >>><<< 30529 1726882705.73312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882705.73322: _low_level_execute_command(): starting 30529 1726882705.73327: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073 `" && echo ansible-tmp-1726882705.7331214-35779-159184702669073="` echo /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073 `" ) && sleep 0' 30529 1726882705.73751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.73754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882705.73757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882705.73759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.73761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.73812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882705.73815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882705.73864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882705.75746: stdout chunk (state=3): 
>>>ansible-tmp-1726882705.7331214-35779-159184702669073=/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073 <<< 30529 1726882705.75855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882705.75878: stderr chunk (state=3): >>><<< 30529 1726882705.75882: stdout chunk (state=3): >>><<< 30529 1726882705.75898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882705.7331214-35779-159184702669073=/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882705.75925: variable 'ansible_module_compression' from source: unknown 30529 1726882705.75961: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30529 1726882705.76012: variable 'ansible_facts' 
from source: unknown 30529 1726882705.76150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py 30529 1726882705.76245: Sending initial data 30529 1726882705.76249: Sent initial data (156 bytes) 30529 1726882705.76679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.76682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882705.76690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882705.76694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.76696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.76741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882705.76744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882705.76792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882705.78342: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30529 1726882705.78349: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882705.78381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882705.78424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjy4cnzp0 /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py <<< 30529 1726882705.78428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py" <<< 30529 1726882705.78477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpjy4cnzp0" to remote "/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py" <<< 30529 1726882705.79503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882705.79539: stderr chunk (state=3): >>><<< 30529 1726882705.79542: stdout chunk (state=3): >>><<< 30529 1726882705.79573: done transferring module to remote 30529 1726882705.79581: _low_level_execute_command(): starting 30529 1726882705.79586: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/ /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py && sleep 0' 30529 1726882705.79997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882705.80000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882705.80003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.80005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.80007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.80063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882705.80066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882705.80107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882705.81872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882705.81896: stderr chunk (state=3): >>><<< 30529 1726882705.81899: stdout chunk 
(state=3): >>><<< 30529 1726882705.81910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882705.81914: _low_level_execute_command(): starting 30529 1726882705.81916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/AnsiballZ_systemd.py && sleep 0' 30529 1726882705.82310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.82313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.82315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882705.82317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882705.82364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882705.82367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882705.82419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.11478: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10776576", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3298480128", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1994408000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": 
"[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30529 1726882706.11485: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 30529 1726882706.11503: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", 
"StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30529 1726882706.13324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882706.13356: stderr chunk (state=3): >>><<< 30529 1726882706.13359: stdout chunk (state=3): >>><<< 30529 1726882706.13378: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10776576", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3298480128", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1994408000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882706.13498: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882706.13515: _low_level_execute_command(): starting 30529 1726882706.13520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882705.7331214-35779-159184702669073/ > /dev/null 2>&1 && sleep 0' 30529 1726882706.13960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.13963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882706.13966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.13968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.13970: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.14026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.14035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.14038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.14073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.15847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.15871: stderr chunk (state=3): >>><<< 30529 1726882706.15874: stdout chunk (state=3): >>><<< 30529 1726882706.15885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.15894: handler run complete 30529 1726882706.15936: attempt loop complete, returning result 30529 1726882706.15939: _execute() done 30529 1726882706.15942: dumping result to json 30529 1726882706.15953: done dumping result, returning 30529 1726882706.15961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-b0f1-edc0-0000000026a0] 30529 1726882706.15967: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a0 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30529 1726882706.16506: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a0 30529 1726882706.16509: WORKER PROCESS EXITING 30529 1726882706.16523: no more pending results, returning what we have 30529 1726882706.16525: results queue empty 30529 1726882706.16526: checking for any_errors_fatal 30529 1726882706.16528: done checking for any_errors_fatal 30529 1726882706.16529: checking for max_fail_percentage 30529 1726882706.16530: done checking for max_fail_percentage 30529 1726882706.16530: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.16531: done checking to see if all hosts have failed 30529 1726882706.16531: getting the remaining hosts for this loop 30529 1726882706.16532: done getting the remaining hosts for this loop 30529 1726882706.16535: getting the next task for host managed_node1 30529 1726882706.16539: done getting next task for host managed_node1 30529 1726882706.16541: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882706.16546: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.16553: getting variables 30529 1726882706.16555: in VariableManager get_vars() 30529 1726882706.16578: Calling all_inventory to load vars for managed_node1 30529 1726882706.16579: Calling groups_inventory to load vars for managed_node1 30529 1726882706.16581: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.16587: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.16591: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.16595: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.17266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.18121: done with get_vars() 30529 1726882706.18138: done getting variables 30529 1726882706.18179: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:26 -0400 (0:00:00.549) 0:02:00.208 ****** 30529 1726882706.18210: entering _queue_task() for managed_node1/service 30529 1726882706.18438: worker is 1 (out of 1 available) 30529 1726882706.18451: exiting _queue_task() for managed_node1/service 30529 1726882706.18463: done queuing things up, now waiting for results queue to drain 30529 1726882706.18464: waiting for pending results... 
30529 1726882706.18648: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30529 1726882706.18754: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a1 30529 1726882706.18766: variable 'ansible_search_path' from source: unknown 30529 1726882706.18770: variable 'ansible_search_path' from source: unknown 30529 1726882706.18802: calling self._execute() 30529 1726882706.18869: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.18872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.18880: variable 'omit' from source: magic vars 30529 1726882706.19159: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.19168: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.19255: variable 'network_provider' from source: set_fact 30529 1726882706.19259: Evaluated conditional (network_provider == "nm"): True 30529 1726882706.19324: variable '__network_wpa_supplicant_required' from source: role '' defaults 30529 1726882706.19386: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30529 1726882706.19501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882706.20925: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882706.20971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882706.21002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882706.21028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882706.21047: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882706.21240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882706.21261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882706.21278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882706.21309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882706.21320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882706.21353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882706.21368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882706.21384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882706.21417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882706.21423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882706.21451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882706.21467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882706.21482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882706.21510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882706.21524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882706.21619: variable 'network_connections' from source: include params 30529 1726882706.21633: variable 'interface' from source: play vars 30529 1726882706.21676: variable 'interface' from source: play vars 30529 1726882706.21727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30529 1726882706.21835: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30529 1726882706.21863: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30529 1726882706.21884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30529 1726882706.21909: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30529 1726882706.21938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30529 1726882706.21954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30529 1726882706.21974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882706.21994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30529 1726882706.22030: variable '__network_wireless_connections_defined' from source: role '' defaults 30529 1726882706.22179: variable 'network_connections' from source: include params 30529 1726882706.22189: variable 'interface' from source: play vars 30529 1726882706.22235: variable 'interface' from source: play vars 30529 1726882706.22258: Evaluated conditional (__network_wpa_supplicant_required): False 30529 1726882706.22261: when evaluation is False, skipping this task 30529 1726882706.22263: _execute() done 30529 1726882706.22266: dumping result to json 30529 1726882706.22268: done dumping result, returning 30529 1726882706.22275: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-b0f1-edc0-0000000026a1] 30529 
1726882706.22287: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a1 30529 1726882706.22373: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a1 30529 1726882706.22376: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30529 1726882706.22445: no more pending results, returning what we have 30529 1726882706.22448: results queue empty 30529 1726882706.22449: checking for any_errors_fatal 30529 1726882706.22469: done checking for any_errors_fatal 30529 1726882706.22470: checking for max_fail_percentage 30529 1726882706.22472: done checking for max_fail_percentage 30529 1726882706.22473: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.22474: done checking to see if all hosts have failed 30529 1726882706.22475: getting the remaining hosts for this loop 30529 1726882706.22477: done getting the remaining hosts for this loop 30529 1726882706.22481: getting the next task for host managed_node1 30529 1726882706.22489: done getting next task for host managed_node1 30529 1726882706.22495: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882706.22500: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882706.22527: getting variables 30529 1726882706.22529: in VariableManager get_vars() 30529 1726882706.22568: Calling all_inventory to load vars for managed_node1 30529 1726882706.22571: Calling groups_inventory to load vars for managed_node1 30529 1726882706.22573: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.22582: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.22585: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.22587: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.23482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.24350: done with get_vars() 30529 1726882706.24366: done getting variables 30529 1726882706.24410: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:26 -0400 (0:00:00.062) 0:02:00.270 
****** 30529 1726882706.24436: entering _queue_task() for managed_node1/service 30529 1726882706.24677: worker is 1 (out of 1 available) 30529 1726882706.24691: exiting _queue_task() for managed_node1/service 30529 1726882706.24706: done queuing things up, now waiting for results queue to drain 30529 1726882706.24708: waiting for pending results... 30529 1726882706.24895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 30529 1726882706.25004: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a2 30529 1726882706.25016: variable 'ansible_search_path' from source: unknown 30529 1726882706.25019: variable 'ansible_search_path' from source: unknown 30529 1726882706.25048: calling self._execute() 30529 1726882706.25122: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.25126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.25134: variable 'omit' from source: magic vars 30529 1726882706.25414: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.25424: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.25507: variable 'network_provider' from source: set_fact 30529 1726882706.25510: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882706.25515: when evaluation is False, skipping this task 30529 1726882706.25517: _execute() done 30529 1726882706.25520: dumping result to json 30529 1726882706.25524: done dumping result, returning 30529 1726882706.25531: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-b0f1-edc0-0000000026a2] 30529 1726882706.25536: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a2 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30529 1726882706.25665: no more pending results, returning what we have 30529 1726882706.25669: results queue empty 30529 1726882706.25670: checking for any_errors_fatal 30529 1726882706.25679: done checking for any_errors_fatal 30529 1726882706.25679: checking for max_fail_percentage 30529 1726882706.25681: done checking for max_fail_percentage 30529 1726882706.25682: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.25683: done checking to see if all hosts have failed 30529 1726882706.25683: getting the remaining hosts for this loop 30529 1726882706.25685: done getting the remaining hosts for this loop 30529 1726882706.25689: getting the next task for host managed_node1 30529 1726882706.25699: done getting next task for host managed_node1 30529 1726882706.25704: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882706.25709: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882706.25733: getting variables 30529 1726882706.25734: in VariableManager get_vars() 30529 1726882706.25772: Calling all_inventory to load vars for managed_node1 30529 1726882706.25774: Calling groups_inventory to load vars for managed_node1 30529 1726882706.25776: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.25786: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.25789: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.25792: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.25806: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a2 30529 1726882706.25809: WORKER PROCESS EXITING 30529 1726882706.26562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.27427: done with get_vars() 30529 1726882706.27442: done getting variables 30529 1726882706.27482: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:26 -0400 (0:00:00.030) 0:02:00.301 ****** 30529 1726882706.27511: entering _queue_task() for managed_node1/copy 30529 1726882706.27741: worker is 1 (out of 1 available) 30529 1726882706.27755: exiting _queue_task() for managed_node1/copy 30529 1726882706.27768: done queuing things up, now waiting for results queue to drain 30529 1726882706.27770: waiting for pending 
results... 30529 1726882706.27958: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30529 1726882706.28065: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a3 30529 1726882706.28076: variable 'ansible_search_path' from source: unknown 30529 1726882706.28079: variable 'ansible_search_path' from source: unknown 30529 1726882706.28113: calling self._execute() 30529 1726882706.28182: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.28186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.28198: variable 'omit' from source: magic vars 30529 1726882706.28473: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.28482: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.28566: variable 'network_provider' from source: set_fact 30529 1726882706.28570: Evaluated conditional (network_provider == "initscripts"): False 30529 1726882706.28573: when evaluation is False, skipping this task 30529 1726882706.28576: _execute() done 30529 1726882706.28579: dumping result to json 30529 1726882706.28583: done dumping result, returning 30529 1726882706.28590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-b0f1-edc0-0000000026a3] 30529 1726882706.28598: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a3 30529 1726882706.28687: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a3 30529 1726882706.28689: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30529 1726882706.28739: no more pending results, returning what we have 30529 1726882706.28743: results queue empty 30529 1726882706.28744: 
checking for any_errors_fatal 30529 1726882706.28750: done checking for any_errors_fatal 30529 1726882706.28751: checking for max_fail_percentage 30529 1726882706.28752: done checking for max_fail_percentage 30529 1726882706.28753: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.28754: done checking to see if all hosts have failed 30529 1726882706.28755: getting the remaining hosts for this loop 30529 1726882706.28756: done getting the remaining hosts for this loop 30529 1726882706.28760: getting the next task for host managed_node1 30529 1726882706.28767: done getting next task for host managed_node1 30529 1726882706.28770: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882706.28775: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.28799: getting variables 30529 1726882706.28801: in VariableManager get_vars() 30529 1726882706.28840: Calling all_inventory to load vars for managed_node1 30529 1726882706.28842: Calling groups_inventory to load vars for managed_node1 30529 1726882706.28844: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.28852: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.28855: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.28857: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.29729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.30565: done with get_vars() 30529 1726882706.30580: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:26 -0400 (0:00:00.031) 0:02:00.332 ****** 30529 1726882706.30642: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882706.30860: worker is 1 (out of 1 available) 30529 1726882706.30875: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 30529 1726882706.30889: done queuing things up, now waiting for results queue to drain 30529 1726882706.30890: waiting for pending results... 
30529 1726882706.31072: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30529 1726882706.31174: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a4 30529 1726882706.31187: variable 'ansible_search_path' from source: unknown 30529 1726882706.31190: variable 'ansible_search_path' from source: unknown 30529 1726882706.31229: calling self._execute() 30529 1726882706.31302: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.31306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.31314: variable 'omit' from source: magic vars 30529 1726882706.31584: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.31599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.31604: variable 'omit' from source: magic vars 30529 1726882706.31646: variable 'omit' from source: magic vars 30529 1726882706.31754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882706.33198: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882706.33242: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882706.33268: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882706.33301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882706.33321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882706.33379: variable 'network_provider' from source: set_fact 30529 1726882706.33471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882706.33490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882706.33513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882706.33539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30529 1726882706.33549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882706.33606: variable 'omit' from source: magic vars 30529 1726882706.33678: variable 'omit' from source: magic vars 30529 1726882706.33750: variable 'network_connections' from source: include params 30529 1726882706.33760: variable 'interface' from source: play vars 30529 1726882706.33806: variable 'interface' from source: play vars 30529 1726882706.33910: variable 'omit' from source: magic vars 30529 1726882706.33916: variable '__lsr_ansible_managed' from source: task vars 30529 1726882706.33959: variable '__lsr_ansible_managed' from source: task vars 30529 1726882706.34087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30529 1726882706.34226: Loaded config def from plugin (lookup/template) 30529 1726882706.34230: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30529 1726882706.34249: File lookup term: get_ansible_managed.j2 30529 1726882706.34252: variable 
'ansible_search_path' from source: unknown 30529 1726882706.34255: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30529 1726882706.34272: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30529 1726882706.34283: variable 'ansible_search_path' from source: unknown 30529 1726882706.37704: variable 'ansible_managed' from source: unknown 30529 1726882706.37778: variable 'omit' from source: magic vars 30529 1726882706.37803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882706.37821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882706.37837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882706.37850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30529 1726882706.37859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.37879: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882706.37882: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.37885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.37955: Set connection var ansible_shell_executable to /bin/sh 30529 1726882706.37958: Set connection var ansible_pipelining to False 30529 1726882706.37960: Set connection var ansible_shell_type to sh 30529 1726882706.37966: Set connection var ansible_timeout to 10 30529 1726882706.37969: Set connection var ansible_connection to ssh 30529 1726882706.37973: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882706.37990: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.37996: variable 'ansible_connection' from source: unknown 30529 1726882706.37998: variable 'ansible_module_compression' from source: unknown 30529 1726882706.38002: variable 'ansible_shell_type' from source: unknown 30529 1726882706.38005: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.38008: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.38010: variable 'ansible_pipelining' from source: unknown 30529 1726882706.38014: variable 'ansible_timeout' from source: unknown 30529 1726882706.38018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.38109: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882706.38121: variable 'omit' from 
source: magic vars 30529 1726882706.38124: starting attempt loop 30529 1726882706.38127: running the handler 30529 1726882706.38137: _low_level_execute_command(): starting 30529 1726882706.38143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882706.38638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.38642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.38645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.38647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.38699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.38702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.38711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.38754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.40332: stdout chunk (state=3): >>>/root <<< 30529 1726882706.40428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 
1726882706.40456: stderr chunk (state=3): >>><<< 30529 1726882706.40463: stdout chunk (state=3): >>><<< 30529 1726882706.40481: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.40492: _low_level_execute_command(): starting 30529 1726882706.40498: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051 `" && echo ansible-tmp-1726882706.4048152-35793-89524172912051="` echo /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051 `" ) && sleep 0' 30529 1726882706.40936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.40940: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882706.40943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.40945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.40947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882706.40948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.40991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.40998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.41050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.42903: stdout chunk (state=3): >>>ansible-tmp-1726882706.4048152-35793-89524172912051=/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051 <<< 30529 1726882706.43002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.43029: stderr chunk (state=3): >>><<< 30529 1726882706.43032: stdout chunk (state=3): >>><<< 30529 1726882706.43050: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882706.4048152-35793-89524172912051=/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.43087: variable 'ansible_module_compression' from source: unknown 30529 1726882706.43125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30529 1726882706.43167: variable 'ansible_facts' from source: unknown 30529 1726882706.43260: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py 30529 1726882706.43357: Sending initial data 30529 1726882706.43360: Sent initial data (167 bytes) 30529 1726882706.43812: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.43815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882706.43822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.43824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882706.43826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882706.43828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.43874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.43877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.43881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.43921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.45449: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882706.45486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882706.45528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqs__dnoe /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py <<< 30529 1726882706.45536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py" <<< 30529 1726882706.45573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpqs__dnoe" to remote "/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py" <<< 30529 1726882706.45579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py" <<< 30529 1726882706.46286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.46333: stderr chunk (state=3): >>><<< 30529 1726882706.46336: stdout chunk (state=3): >>><<< 30529 1726882706.46358: done transferring module to remote 30529 1726882706.46367: _low_level_execute_command(): starting 30529 1726882706.46372: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/ 
/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py && sleep 0' 30529 1726882706.46831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.46834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882706.46836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.46839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.46841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.46903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.46906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.46907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.46944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.48644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.48675: stderr chunk (state=3): >>><<< 30529 1726882706.48678: stdout chunk (state=3): >>><<< 30529 1726882706.48694: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.48698: _low_level_execute_command(): starting 30529 1726882706.48700: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/AnsiballZ_network_connections.py && sleep 0' 30529 1726882706.49139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882706.49142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882706.49144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.49146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.49197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.49201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.49256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.74815: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30529 1726882706.76432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882706.76460: stderr chunk (state=3): >>><<< 30529 1726882706.76465: stdout chunk (state=3): >>><<< 30529 1726882706.76480: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882706.76515: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882706.76523: _low_level_execute_command(): starting 30529 1726882706.76528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882706.4048152-35793-89524172912051/ > /dev/null 2>&1 && sleep 0' 30529 1726882706.76971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882706.76975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882706.76977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.76979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882706.76983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.77036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.77043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.77046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.77086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.78879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.78908: stderr chunk (state=3): >>><<< 30529 1726882706.78912: stdout chunk (state=3): >>><<< 30529 1726882706.78925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.78930: handler run complete 30529 1726882706.78949: attempt loop complete, returning result 30529 1726882706.78952: _execute() done 30529 1726882706.78954: dumping result to json 30529 1726882706.78959: done dumping result, returning 30529 1726882706.78967: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-b0f1-edc0-0000000026a4] 30529 1726882706.78975: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a4 ok: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30529 1726882706.79172: no more pending results, returning what we have 30529 1726882706.79175: results queue empty 30529 1726882706.79176: checking for any_errors_fatal 30529 1726882706.79182: done checking for any_errors_fatal 30529 1726882706.79183: checking for max_fail_percentage 30529 1726882706.79184: done checking for max_fail_percentage 30529 1726882706.79185: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.79186: done checking to see if all hosts have failed 30529 1726882706.79187: getting the remaining hosts for this loop 30529 1726882706.79191: done getting the 
remaining hosts for this loop 30529 1726882706.79196: getting the next task for host managed_node1 30529 1726882706.79204: done getting next task for host managed_node1 30529 1726882706.79207: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882706.79212: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.79226: getting variables 30529 1726882706.79228: in VariableManager get_vars() 30529 1726882706.79269: Calling all_inventory to load vars for managed_node1 30529 1726882706.79271: Calling groups_inventory to load vars for managed_node1 30529 1726882706.79274: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.79284: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.79289: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.79292: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.79306: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a4 30529 1726882706.79309: WORKER PROCESS EXITING 30529 1726882706.80155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.81146: done with get_vars() 30529 1726882706.81162: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:26 -0400 (0:00:00.505) 0:02:00.838 ****** 30529 1726882706.81228: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882706.81471: worker is 1 (out of 1 available) 30529 1726882706.81484: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 30529 1726882706.81503: done queuing things up, now waiting for results queue to drain 30529 1726882706.81504: waiting for pending results... 
30529 1726882706.81685: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 30529 1726882706.81797: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a5 30529 1726882706.81808: variable 'ansible_search_path' from source: unknown 30529 1726882706.81812: variable 'ansible_search_path' from source: unknown 30529 1726882706.81842: calling self._execute() 30529 1726882706.81913: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.81917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.81926: variable 'omit' from source: magic vars 30529 1726882706.82200: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.82210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.82294: variable 'network_state' from source: role '' defaults 30529 1726882706.82302: Evaluated conditional (network_state != {}): False 30529 1726882706.82305: when evaluation is False, skipping this task 30529 1726882706.82307: _execute() done 30529 1726882706.82311: dumping result to json 30529 1726882706.82314: done dumping result, returning 30529 1726882706.82320: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-b0f1-edc0-0000000026a5] 30529 1726882706.82324: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a5 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30529 1726882706.82463: no more pending results, returning what we have 30529 1726882706.82467: results queue empty 30529 1726882706.82468: checking for any_errors_fatal 30529 1726882706.82479: done checking for any_errors_fatal 30529 1726882706.82479: checking for max_fail_percentage 30529 1726882706.82481: done checking for max_fail_percentage 30529 1726882706.82482: 
checking to see if all hosts have failed and the running result is not ok 30529 1726882706.82482: done checking to see if all hosts have failed 30529 1726882706.82483: getting the remaining hosts for this loop 30529 1726882706.82485: done getting the remaining hosts for this loop 30529 1726882706.82490: getting the next task for host managed_node1 30529 1726882706.82504: done getting next task for host managed_node1 30529 1726882706.82508: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882706.82512: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.82534: getting variables 30529 1726882706.82536: in VariableManager get_vars() 30529 1726882706.82572: Calling all_inventory to load vars for managed_node1 30529 1726882706.82574: Calling groups_inventory to load vars for managed_node1 30529 1726882706.82577: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.82585: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.82590: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.82598: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.82607: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a5 30529 1726882706.82610: WORKER PROCESS EXITING 30529 1726882706.83370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.84245: done with get_vars() 30529 1726882706.84264: done getting variables 30529 1726882706.84312: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:26 -0400 (0:00:00.031) 0:02:00.869 ****** 30529 1726882706.84340: entering _queue_task() for managed_node1/debug 30529 1726882706.84601: worker is 1 (out of 1 available) 30529 1726882706.84617: exiting _queue_task() for managed_node1/debug 30529 1726882706.84631: done queuing things up, now waiting for results queue to drain 30529 1726882706.84633: waiting for pending results... 
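The "Configure networking state" task above is skipped because the role default `network_state` is an empty dict, so its `when: network_state != {}` guard evaluates False and the executor reports "Conditional result was False" without running the module. A minimal, hypothetical sketch of that skip branch (the function name and return shape are illustrative, not Ansible's actual internals):

```python
# Hypothetical sketch of the skip logic seen in the log above.
# The role default network_state is {}, so the guard is False.
network_state = {}  # "variable 'network_state' from source: role '' defaults"

def should_run(conditional_result: bool) -> tuple[bool, str]:
    """Mimic 'when evaluation is False, skipping this task':
    return (run?, skip_reason)."""
    if not conditional_result:
        return (False, "Conditional result was False")
    return (True, "")

run, skip_reason = should_run(network_state != {})
# run is False; skip_reason matches the "skip_reason" field in the
# skipping: [managed_node1] result above.
```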
30529 1726882706.84830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30529 1726882706.84927: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a6 30529 1726882706.84939: variable 'ansible_search_path' from source: unknown 30529 1726882706.84943: variable 'ansible_search_path' from source: unknown 30529 1726882706.84974: calling self._execute() 30529 1726882706.85051: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.85054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.85063: variable 'omit' from source: magic vars 30529 1726882706.85348: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.85357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.85364: variable 'omit' from source: magic vars 30529 1726882706.85416: variable 'omit' from source: magic vars 30529 1726882706.85439: variable 'omit' from source: magic vars 30529 1726882706.85470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882706.85499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882706.85516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882706.85529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.85540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.85562: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882706.85565: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.85567: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 30529 1726882706.85642: Set connection var ansible_shell_executable to /bin/sh 30529 1726882706.85646: Set connection var ansible_pipelining to False 30529 1726882706.85648: Set connection var ansible_shell_type to sh 30529 1726882706.85656: Set connection var ansible_timeout to 10 30529 1726882706.85658: Set connection var ansible_connection to ssh 30529 1726882706.85663: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882706.85680: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.85683: variable 'ansible_connection' from source: unknown 30529 1726882706.85685: variable 'ansible_module_compression' from source: unknown 30529 1726882706.85690: variable 'ansible_shell_type' from source: unknown 30529 1726882706.85694: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.85697: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.85699: variable 'ansible_pipelining' from source: unknown 30529 1726882706.85702: variable 'ansible_timeout' from source: unknown 30529 1726882706.85704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.85800: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882706.85809: variable 'omit' from source: magic vars 30529 1726882706.85814: starting attempt loop 30529 1726882706.85817: running the handler 30529 1726882706.85911: variable '__network_connections_result' from source: set_fact 30529 1726882706.85951: handler run complete 30529 1726882706.85963: attempt loop complete, returning result 30529 1726882706.85966: _execute() done 30529 1726882706.85969: dumping result to json 30529 1726882706.85971: 
done dumping result, returning 30529 1726882706.85980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000026a6] 30529 1726882706.85982: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a6 30529 1726882706.86068: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a6 30529 1726882706.86071: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30529 1726882706.86144: no more pending results, returning what we have 30529 1726882706.86147: results queue empty 30529 1726882706.86148: checking for any_errors_fatal 30529 1726882706.86155: done checking for any_errors_fatal 30529 1726882706.86156: checking for max_fail_percentage 30529 1726882706.86158: done checking for max_fail_percentage 30529 1726882706.86158: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.86159: done checking to see if all hosts have failed 30529 1726882706.86160: getting the remaining hosts for this loop 30529 1726882706.86162: done getting the remaining hosts for this loop 30529 1726882706.86165: getting the next task for host managed_node1 30529 1726882706.86173: done getting next task for host managed_node1 30529 1726882706.86177: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882706.86182: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882706.86199: getting variables 30529 1726882706.86201: in VariableManager get_vars() 30529 1726882706.86239: Calling all_inventory to load vars for managed_node1 30529 1726882706.86241: Calling groups_inventory to load vars for managed_node1 30529 1726882706.86244: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.86255: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.86258: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.86260: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.87245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.88091: done with get_vars() 30529 1726882706.88108: done getting variables 30529 1726882706.88150: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:26 -0400 (0:00:00.038) 0:02:00.907 ****** 30529 1726882706.88178: entering _queue_task() for managed_node1/debug 30529 1726882706.88416: worker is 1 (out of 1 available) 30529 1726882706.88430: exiting _queue_task() for managed_node1/debug 30529 1726882706.88442: done queuing things up, now waiting for results queue to drain 30529 1726882706.88444: waiting for pending results... 30529 1726882706.88629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30529 1726882706.88724: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a7 30529 1726882706.88737: variable 'ansible_search_path' from source: unknown 30529 1726882706.88740: variable 'ansible_search_path' from source: unknown 30529 1726882706.88767: calling self._execute() 30529 1726882706.88845: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.88849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.88859: variable 'omit' from source: magic vars 30529 1726882706.89131: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.89141: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.89148: variable 'omit' from source: magic vars 30529 1726882706.89192: variable 'omit' from source: magic vars 30529 1726882706.89216: variable 'omit' from source: magic vars 30529 1726882706.89248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882706.89275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882706.89295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882706.89307: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.89322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.89343: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882706.89346: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.89349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.89419: Set connection var ansible_shell_executable to /bin/sh 30529 1726882706.89422: Set connection var ansible_pipelining to False 30529 1726882706.89426: Set connection var ansible_shell_type to sh 30529 1726882706.89434: Set connection var ansible_timeout to 10 30529 1726882706.89438: Set connection var ansible_connection to ssh 30529 1726882706.89441: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882706.89459: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.89462: variable 'ansible_connection' from source: unknown 30529 1726882706.89465: variable 'ansible_module_compression' from source: unknown 30529 1726882706.89467: variable 'ansible_shell_type' from source: unknown 30529 1726882706.89469: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.89471: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.89473: variable 'ansible_pipelining' from source: unknown 30529 1726882706.89475: variable 'ansible_timeout' from source: unknown 30529 1726882706.89480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.89580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882706.89799: variable 'omit' from source: magic vars 30529 1726882706.89802: starting attempt loop 30529 1726882706.89805: running the handler 30529 1726882706.89807: variable '__network_connections_result' from source: set_fact 30529 1726882706.89809: variable '__network_connections_result' from source: set_fact 30529 1726882706.89837: handler run complete 30529 1726882706.89866: attempt loop complete, returning result 30529 1726882706.89874: _execute() done 30529 1726882706.89881: dumping result to json 30529 1726882706.89889: done dumping result, returning 30529 1726882706.89906: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-b0f1-edc0-0000000026a7] 30529 1726882706.89916: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a7 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30529 1726882706.90117: no more pending results, returning what we have 30529 1726882706.90121: results queue empty 30529 1726882706.90122: checking for any_errors_fatal 30529 1726882706.90129: done checking for any_errors_fatal 30529 1726882706.90130: checking for max_fail_percentage 30529 1726882706.90132: done checking for 
max_fail_percentage 30529 1726882706.90133: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.90134: done checking to see if all hosts have failed 30529 1726882706.90134: getting the remaining hosts for this loop 30529 1726882706.90136: done getting the remaining hosts for this loop 30529 1726882706.90141: getting the next task for host managed_node1 30529 1726882706.90153: done getting next task for host managed_node1 30529 1726882706.90156: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882706.90162: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.90180: getting variables 30529 1726882706.90182: in VariableManager get_vars() 30529 1726882706.90333: Calling all_inventory to load vars for managed_node1 30529 1726882706.90336: Calling groups_inventory to load vars for managed_node1 30529 1726882706.90339: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.90346: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a7 30529 1726882706.90357: WORKER PROCESS EXITING 30529 1726882706.90369: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.90372: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.90376: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.91630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.92484: done with get_vars() 30529 1726882706.92505: done getting variables 30529 1726882706.92547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:26 -0400 (0:00:00.043) 0:02:00.951 ****** 30529 1726882706.92572: entering _queue_task() for managed_node1/debug 30529 1726882706.92820: worker is 1 (out of 1 available) 30529 1726882706.92832: exiting _queue_task() for managed_node1/debug 30529 1726882706.92844: done queuing things up, now waiting for results queue to drain 30529 1726882706.92846: waiting for pending results... 
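The debug results earlier in this run show both a raw `stderr` string (with a trailing newline) and a `stderr_lines` list containing the same message. The `_lines` form is simply the string split on line boundaries; a sketch of that relationship, using the exact message from the log:

```python
# stderr vs stderr_lines as reported in the __network_connections_result
# debug output above: the list form is the newline-split of the raw form.
stderr = ("[002] #0, state:down persistent_state:absent, "
          "'statebr': no connection matches 'statebr' to delete\n")
stderr_lines = stderr.splitlines()  # trailing newline does not add an entry
```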
30529 1726882706.93042: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30529 1726882706.93145: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a8 30529 1726882706.93156: variable 'ansible_search_path' from source: unknown 30529 1726882706.93160: variable 'ansible_search_path' from source: unknown 30529 1726882706.93190: calling self._execute() 30529 1726882706.93269: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.93273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.93281: variable 'omit' from source: magic vars 30529 1726882706.93574: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.93583: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.93677: variable 'network_state' from source: role '' defaults 30529 1726882706.93686: Evaluated conditional (network_state != {}): False 30529 1726882706.93689: when evaluation is False, skipping this task 30529 1726882706.93695: _execute() done 30529 1726882706.93697: dumping result to json 30529 1726882706.93700: done dumping result, returning 30529 1726882706.93709: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-b0f1-edc0-0000000026a8] 30529 1726882706.93713: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a8 30529 1726882706.93800: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a8 30529 1726882706.93803: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 30529 1726882706.93876: no more pending results, returning what we have 30529 1726882706.93880: results queue empty 30529 1726882706.93881: checking for any_errors_fatal 30529 1726882706.93894: done checking for any_errors_fatal 30529 1726882706.93895: checking for 
max_fail_percentage 30529 1726882706.93897: done checking for max_fail_percentage 30529 1726882706.93898: checking to see if all hosts have failed and the running result is not ok 30529 1726882706.93898: done checking to see if all hosts have failed 30529 1726882706.93899: getting the remaining hosts for this loop 30529 1726882706.93901: done getting the remaining hosts for this loop 30529 1726882706.93904: getting the next task for host managed_node1 30529 1726882706.93912: done getting next task for host managed_node1 30529 1726882706.93916: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882706.93920: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882706.93946: getting variables 30529 1726882706.93948: in VariableManager get_vars() 30529 1726882706.93988: Calling all_inventory to load vars for managed_node1 30529 1726882706.93991: Calling groups_inventory to load vars for managed_node1 30529 1726882706.93997: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882706.94007: Calling all_plugins_play to load vars for managed_node1 30529 1726882706.94010: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882706.94012: Calling groups_plugins_play to load vars for managed_node1 30529 1726882706.94951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882706.95796: done with get_vars() 30529 1726882706.95813: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:26 -0400 (0:00:00.033) 0:02:00.984 ****** 30529 1726882706.95883: entering _queue_task() for managed_node1/ping 30529 1726882706.96139: worker is 1 (out of 1 available) 30529 1726882706.96152: exiting _queue_task() for managed_node1/ping 30529 1726882706.96166: done queuing things up, now waiting for results queue to drain 30529 1726882706.96167: waiting for pending results... 
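The queued ping task runs over the ssh connection plugin, which first creates a unique per-task remote temp directory (the `ansible-tmp-1726882706.997986-35808-...` paths visible in the `_low_level_execute_command()` traces that follow). A hypothetical sketch of how such a timestamp-pid-random name can be composed; the function and exact field choices are assumptions for illustration, not Ansible's implementation:

```python
import os
import random
import time

def make_remote_tmp_name(base: str = "/root/.ansible/tmp") -> str:
    """Compose an ansible-tmp-<epoch>-<pid>-<random> style path,
    mirroring the tmp dirs seen in the log (illustrative only)."""
    return "%s/ansible-tmp-%s-%s-%s" % (
        base, time.time(), os.getpid(), random.randint(0, 2**48))
```

The three varying fields make collisions between concurrent workers on the same host effectively impossible, which is why each queued task in the log gets its own directory.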
30529 1726882706.96364: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 30529 1726882706.96468: in run() - task 12673a56-9f93-b0f1-edc0-0000000026a9 30529 1726882706.96480: variable 'ansible_search_path' from source: unknown 30529 1726882706.96483: variable 'ansible_search_path' from source: unknown 30529 1726882706.96516: calling self._execute() 30529 1726882706.96588: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.96596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.96607: variable 'omit' from source: magic vars 30529 1726882706.96882: variable 'ansible_distribution_major_version' from source: facts 30529 1726882706.96896: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882706.96902: variable 'omit' from source: magic vars 30529 1726882706.96950: variable 'omit' from source: magic vars 30529 1726882706.96971: variable 'omit' from source: magic vars 30529 1726882706.97007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882706.97034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882706.97053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882706.97066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.97076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882706.97103: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882706.97107: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.97109: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882706.97180: Set connection var ansible_shell_executable to /bin/sh 30529 1726882706.97183: Set connection var ansible_pipelining to False 30529 1726882706.97186: Set connection var ansible_shell_type to sh 30529 1726882706.97197: Set connection var ansible_timeout to 10 30529 1726882706.97200: Set connection var ansible_connection to ssh 30529 1726882706.97205: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882706.97223: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.97225: variable 'ansible_connection' from source: unknown 30529 1726882706.97228: variable 'ansible_module_compression' from source: unknown 30529 1726882706.97230: variable 'ansible_shell_type' from source: unknown 30529 1726882706.97233: variable 'ansible_shell_executable' from source: unknown 30529 1726882706.97235: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882706.97239: variable 'ansible_pipelining' from source: unknown 30529 1726882706.97241: variable 'ansible_timeout' from source: unknown 30529 1726882706.97245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882706.97390: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882706.97403: variable 'omit' from source: magic vars 30529 1726882706.97407: starting attempt loop 30529 1726882706.97410: running the handler 30529 1726882706.97422: _low_level_execute_command(): starting 30529 1726882706.97429: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882706.97934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 
1726882706.97938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.97941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882706.97943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882706.97999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882706.98002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882706.98012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882706.98053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882706.99644: stdout chunk (state=3): >>>/root <<< 30529 1726882706.99740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882706.99770: stderr chunk (state=3): >>><<< 30529 1726882706.99774: stdout chunk (state=3): >>><<< 30529 1726882706.99798: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882706.99811: _low_level_execute_command(): starting 30529 1726882706.99817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141 `" && echo ansible-tmp-1726882706.997986-35808-189688448819141="` echo /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141 `" ) && sleep 0' 30529 1726882707.00266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.00269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882707.00271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.00283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.00285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.00333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.00336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.00384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.02234: stdout chunk (state=3): >>>ansible-tmp-1726882706.997986-35808-189688448819141=/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141 <<< 30529 1726882707.02342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.02371: stderr chunk (state=3): >>><<< 30529 1726882707.02374: stdout chunk (state=3): >>><<< 30529 1726882707.02389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882706.997986-35808-189688448819141=/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.02435: variable 'ansible_module_compression' from source: unknown 30529 1726882707.02473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30529 1726882707.02506: variable 'ansible_facts' from source: unknown 30529 1726882707.02558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py 30529 1726882707.02666: Sending initial data 30529 1726882707.02669: Sent initial data (152 bytes) 30529 1726882707.03124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.03127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.03129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882707.03132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.03185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.03196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.03198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.03235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.04754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882707.04791: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882707.04834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpna89xveo /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py <<< 30529 1726882707.04837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py" <<< 30529 1726882707.04876: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpna89xveo" to remote "/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py" <<< 30529 1726882707.04882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py" <<< 30529 1726882707.05372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.05421: stderr chunk (state=3): >>><<< 30529 1726882707.05425: stdout chunk (state=3): >>><<< 30529 1726882707.05446: done transferring module to remote 30529 1726882707.05456: _low_level_execute_command(): starting 30529 1726882707.05461: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/ /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py && sleep 0' 30529 1726882707.05913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.05916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.05918: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.05920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.05976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.05979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.05987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.06026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.07731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.07760: stderr chunk (state=3): >>><<< 30529 1726882707.07763: stdout chunk (state=3): >>><<< 30529 1726882707.07781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.07784: _low_level_execute_command(): starting 30529 1726882707.07791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/AnsiballZ_ping.py && sleep 0' 30529 1726882707.08244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.08247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882707.08249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.08251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.08253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.08299: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.08314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.08360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.23147: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30529 1726882707.24316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882707.24347: stderr chunk (state=3): >>><<< 30529 1726882707.24351: stdout chunk (state=3): >>><<< 30529 1726882707.24365: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 30529 1726882707.24387: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882707.24400: _low_level_execute_command(): starting 30529 1726882707.24404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882706.997986-35808-189688448819141/ > /dev/null 2>&1 && sleep 0' 30529 1726882707.24847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.24851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.24864: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.24923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.24927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.24929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.24976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.26755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.26780: stderr chunk (state=3): >>><<< 30529 1726882707.26787: stdout chunk (state=3): >>><<< 30529 1726882707.26802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.26808: 
handler run complete 30529 1726882707.26820: attempt loop complete, returning result 30529 1726882707.26823: _execute() done 30529 1726882707.26826: dumping result to json 30529 1726882707.26828: done dumping result, returning 30529 1726882707.26836: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-b0f1-edc0-0000000026a9] 30529 1726882707.26840: sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a9 30529 1726882707.26927: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000026a9 30529 1726882707.26930: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 30529 1726882707.26998: no more pending results, returning what we have 30529 1726882707.27002: results queue empty 30529 1726882707.27003: checking for any_errors_fatal 30529 1726882707.27012: done checking for any_errors_fatal 30529 1726882707.27013: checking for max_fail_percentage 30529 1726882707.27014: done checking for max_fail_percentage 30529 1726882707.27015: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.27016: done checking to see if all hosts have failed 30529 1726882707.27017: getting the remaining hosts for this loop 30529 1726882707.27018: done getting the remaining hosts for this loop 30529 1726882707.27021: getting the next task for host managed_node1 30529 1726882707.27034: done getting next task for host managed_node1 30529 1726882707.27036: ^ task is: TASK: meta (role_complete) 30529 1726882707.27041: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882707.27056: getting variables 30529 1726882707.27058: in VariableManager get_vars() 30529 1726882707.27109: Calling all_inventory to load vars for managed_node1 30529 1726882707.27112: Calling groups_inventory to load vars for managed_node1 30529 1726882707.27114: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.27124: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.27126: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.27129: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.27953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.28809: done with get_vars() 30529 1726882707.28830: done getting variables 30529 1726882707.28888: done queuing things up, now waiting for results queue to drain 30529 1726882707.28890: results queue empty 30529 1726882707.28891: checking for any_errors_fatal 30529 1726882707.28894: done checking for any_errors_fatal 30529 1726882707.28895: checking for max_fail_percentage 30529 1726882707.28895: done checking for max_fail_percentage 30529 1726882707.28896: checking to see if all 
hosts have failed and the running result is not ok 30529 1726882707.28896: done checking to see if all hosts have failed 30529 1726882707.28897: getting the remaining hosts for this loop 30529 1726882707.28897: done getting the remaining hosts for this loop 30529 1726882707.28899: getting the next task for host managed_node1 30529 1726882707.28903: done getting next task for host managed_node1 30529 1726882707.28905: ^ task is: TASK: Asserts 30529 1726882707.28906: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882707.28908: getting variables 30529 1726882707.28909: in VariableManager get_vars() 30529 1726882707.28918: Calling all_inventory to load vars for managed_node1 30529 1726882707.28919: Calling groups_inventory to load vars for managed_node1 30529 1726882707.28921: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.28924: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.28926: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.28927: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.29640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.30473: done with get_vars() 30529 1726882707.30487: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:38:27 -0400 (0:00:00.346) 0:02:01.331 ****** 30529 1726882707.30541: entering _queue_task() for managed_node1/include_tasks 30529 1726882707.30861: worker is 1 (out of 1 available) 30529 1726882707.30875: exiting _queue_task() for managed_node1/include_tasks 30529 1726882707.30888: done queuing things up, now waiting for results queue to drain 30529 1726882707.30890: waiting for pending results... 
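The trace above walks through Ansible's remote-execution pipeline for a single module run: create a private per-task temp directory over SSH (`umask 77 && mkdir ...`), transfer the AnsiballZ payload via sftp, `chmod u+x` the directory and payload, execute it with the remote Python interpreter, parse the JSON it prints, then `rm -f -r` the temp directory. The sketch below re-enacts that sequence locally so the steps are easier to follow; it is an illustration only, not Ansible's actual `_low_level_execute_command()` implementation, and the helper name, paths, and stand-in payload are all hypothetical.

```python
import json
import shlex
import subprocess
import tempfile
from pathlib import Path

def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Local stand-in (hypothetical) for the _low_level_execute_command()
    calls in the log: run a /bin/sh -c command, capture rc/stdout/stderr."""
    proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# 1. Private per-task temp dir (the `umask 77 && mkdir` idiom from the log).
workdir = Path(tempfile.mkdtemp())
tmp = workdir / "ansible-tmp-demo"
rc, _, _ = low_level_execute(f"( umask 77 && mkdir {shlex.quote(str(tmp))} )")
assert rc == 0

# 2. "Transfer" a stand-in for AnsiballZ_ping.py (the real run uses sftp put).
payload = tmp / "AnsiballZ_ping.py"
payload.write_text('import json; print(json.dumps({"ping": "pong"}))\n')

# 3. chmod u+x on the directory and payload, as the log's chmod step does.
low_level_execute(f"chmod u+x {shlex.quote(str(tmp))} {shlex.quote(str(payload))}")

# 4. Execute the payload with the interpreter and parse the JSON on stdout.
rc, out, _ = low_level_execute(f"python3 {shlex.quote(str(payload))}")
result = json.loads(out)

# 5. Clean up the per-task directory, as the final `rm -f -r` step does.
low_level_execute(f"rm -rf {shlex.quote(str(workdir))}")
print(result["ping"])
```

In the real run every step after (1) goes over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks) instead of a local shell, but the ordering and the JSON-over-stdout contract are the same.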
30529 1726882707.31086: running TaskExecutor() for managed_node1/TASK: Asserts 30529 1726882707.31174: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b2 30529 1726882707.31188: variable 'ansible_search_path' from source: unknown 30529 1726882707.31192: variable 'ansible_search_path' from source: unknown 30529 1726882707.31233: variable 'lsr_assert' from source: include params 30529 1726882707.31404: variable 'lsr_assert' from source: include params 30529 1726882707.31458: variable 'omit' from source: magic vars 30529 1726882707.31560: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.31567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.31577: variable 'omit' from source: magic vars 30529 1726882707.31748: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.31755: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.31762: variable 'item' from source: unknown 30529 1726882707.31812: variable 'item' from source: unknown 30529 1726882707.31835: variable 'item' from source: unknown 30529 1726882707.31882: variable 'item' from source: unknown 30529 1726882707.32015: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.32019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.32021: variable 'omit' from source: magic vars 30529 1726882707.32089: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.32096: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.32102: variable 'item' from source: unknown 30529 1726882707.32147: variable 'item' from source: unknown 30529 1726882707.32166: variable 'item' from source: unknown 30529 1726882707.32212: variable 'item' from source: unknown 30529 1726882707.32274: dumping result to json 30529 1726882707.32277: done dumping result, returning 30529 
1726882707.32279: done running TaskExecutor() for managed_node1/TASK: Asserts [12673a56-9f93-b0f1-edc0-0000000020b2] 30529 1726882707.32281: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b2 30529 1726882707.32314: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b2 30529 1726882707.32318: WORKER PROCESS EXITING 30529 1726882707.32341: no more pending results, returning what we have 30529 1726882707.32345: in VariableManager get_vars() 30529 1726882707.32389: Calling all_inventory to load vars for managed_node1 30529 1726882707.32391: Calling groups_inventory to load vars for managed_node1 30529 1726882707.32396: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.32410: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.32413: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.32416: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.33246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.34183: done with get_vars() 30529 1726882707.34199: variable 'ansible_search_path' from source: unknown 30529 1726882707.34200: variable 'ansible_search_path' from source: unknown 30529 1726882707.34228: variable 'ansible_search_path' from source: unknown 30529 1726882707.34229: variable 'ansible_search_path' from source: unknown 30529 1726882707.34245: we have included files to process 30529 1726882707.34246: generating all_blocks data 30529 1726882707.34247: done generating all_blocks data 30529 1726882707.34252: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882707.34253: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882707.34254: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30529 1726882707.34323: in VariableManager get_vars() 30529 1726882707.34339: done with get_vars() 30529 1726882707.34415: done processing included file 30529 1726882707.34417: iterating over new_blocks loaded from include file 30529 1726882707.34418: in VariableManager get_vars() 30529 1726882707.34428: done with get_vars() 30529 1726882707.34429: filtering new block on tags 30529 1726882707.34453: done filtering new block on tags 30529 1726882707.34454: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 => (item=tasks/assert_profile_absent.yml) 30529 1726882707.34457: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30529 1726882707.34458: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30529 1726882707.34460: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30529 1726882707.34694: done processing included file 30529 1726882707.34696: iterating over new_blocks loaded from include file 30529 1726882707.34696: in VariableManager get_vars() 30529 1726882707.34707: done with get_vars() 30529 1726882707.34708: filtering new block on tags 30529 1726882707.34733: done filtering new block on tags 30529 1726882707.34734: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node1 => (item=tasks/get_NetworkManager_NVR.yml) 30529 1726882707.34737: extending task lists 
for all hosts with included blocks 30529 1726882707.35338: done extending task lists 30529 1726882707.35339: done processing included files 30529 1726882707.35340: results queue empty 30529 1726882707.35340: checking for any_errors_fatal 30529 1726882707.35341: done checking for any_errors_fatal 30529 1726882707.35342: checking for max_fail_percentage 30529 1726882707.35343: done checking for max_fail_percentage 30529 1726882707.35343: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.35344: done checking to see if all hosts have failed 30529 1726882707.35344: getting the remaining hosts for this loop 30529 1726882707.35345: done getting the remaining hosts for this loop 30529 1726882707.35346: getting the next task for host managed_node1 30529 1726882707.35349: done getting next task for host managed_node1 30529 1726882707.35351: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30529 1726882707.35353: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882707.35354: getting variables 30529 1726882707.35360: in VariableManager get_vars() 30529 1726882707.35367: Calling all_inventory to load vars for managed_node1 30529 1726882707.35368: Calling groups_inventory to load vars for managed_node1 30529 1726882707.35370: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.35373: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.35375: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.35376: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.35986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.36809: done with get_vars() 30529 1726882707.36826: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:38:27 -0400 (0:00:00.063) 0:02:01.394 ****** 30529 1726882707.36873: entering _queue_task() for managed_node1/include_tasks 30529 1726882707.37122: worker is 1 (out of 1 available) 30529 1726882707.37132: exiting _queue_task() for managed_node1/include_tasks 30529 1726882707.37146: done queuing things up, now waiting for results queue to drain 30529 1726882707.37147: waiting for pending results... 
30529 1726882707.37337: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 30529 1726882707.37418: in run() - task 12673a56-9f93-b0f1-edc0-000000002804 30529 1726882707.37430: variable 'ansible_search_path' from source: unknown 30529 1726882707.37433: variable 'ansible_search_path' from source: unknown 30529 1726882707.37459: calling self._execute() 30529 1726882707.37536: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.37540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.37547: variable 'omit' from source: magic vars 30529 1726882707.37825: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.37835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.37840: _execute() done 30529 1726882707.37843: dumping result to json 30529 1726882707.37846: done dumping result, returning 30529 1726882707.37852: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-b0f1-edc0-000000002804] 30529 1726882707.37857: sending task result for task 12673a56-9f93-b0f1-edc0-000000002804 30529 1726882707.37941: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002804 30529 1726882707.37943: WORKER PROCESS EXITING 30529 1726882707.37971: no more pending results, returning what we have 30529 1726882707.37976: in VariableManager get_vars() 30529 1726882707.38024: Calling all_inventory to load vars for managed_node1 30529 1726882707.38027: Calling groups_inventory to load vars for managed_node1 30529 1726882707.38030: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.38042: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.38045: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.38047: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882707.42958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.43779: done with get_vars() 30529 1726882707.43797: variable 'ansible_search_path' from source: unknown 30529 1726882707.43798: variable 'ansible_search_path' from source: unknown 30529 1726882707.43805: variable 'item' from source: include params 30529 1726882707.43864: variable 'item' from source: include params 30529 1726882707.43885: we have included files to process 30529 1726882707.43886: generating all_blocks data 30529 1726882707.43887: done generating all_blocks data 30529 1726882707.43888: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882707.43889: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882707.43890: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30529 1726882707.44446: done processing included file 30529 1726882707.44447: iterating over new_blocks loaded from include file 30529 1726882707.44448: in VariableManager get_vars() 30529 1726882707.44460: done with get_vars() 30529 1726882707.44461: filtering new block on tags 30529 1726882707.44502: done filtering new block on tags 30529 1726882707.44505: in VariableManager get_vars() 30529 1726882707.44515: done with get_vars() 30529 1726882707.44516: filtering new block on tags 30529 1726882707.44545: done filtering new block on tags 30529 1726882707.44547: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 30529 1726882707.44550: extending task lists for all hosts with included blocks 30529 1726882707.44680: done 
extending task lists 30529 1726882707.44681: done processing included files 30529 1726882707.44681: results queue empty 30529 1726882707.44682: checking for any_errors_fatal 30529 1726882707.44684: done checking for any_errors_fatal 30529 1726882707.44684: checking for max_fail_percentage 30529 1726882707.44685: done checking for max_fail_percentage 30529 1726882707.44685: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.44686: done checking to see if all hosts have failed 30529 1726882707.44686: getting the remaining hosts for this loop 30529 1726882707.44687: done getting the remaining hosts for this loop 30529 1726882707.44689: getting the next task for host managed_node1 30529 1726882707.44692: done getting next task for host managed_node1 30529 1726882707.44695: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882707.44697: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30529 1726882707.44698: getting variables 30529 1726882707.44699: in VariableManager get_vars() 30529 1726882707.44707: Calling all_inventory to load vars for managed_node1 30529 1726882707.44709: Calling groups_inventory to load vars for managed_node1 30529 1726882707.44711: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.44714: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.44716: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.44717: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.45320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.46147: done with get_vars() 30529 1726882707.46160: done getting variables 30529 1726882707.46184: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:38:27 -0400 (0:00:00.093) 0:02:01.488 ****** 30529 1726882707.46205: entering _queue_task() for managed_node1/set_fact 30529 1726882707.46482: worker is 1 (out of 1 available) 30529 1726882707.46495: exiting _queue_task() for managed_node1/set_fact 30529 1726882707.46509: done queuing things up, now waiting for results queue to drain 30529 1726882707.46510: waiting for pending results... 
30529 1726882707.46699: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 30529 1726882707.46785: in run() - task 12673a56-9f93-b0f1-edc0-000000002888 30529 1726882707.46801: variable 'ansible_search_path' from source: unknown 30529 1726882707.46805: variable 'ansible_search_path' from source: unknown 30529 1726882707.46833: calling self._execute() 30529 1726882707.46904: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.46908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.46917: variable 'omit' from source: magic vars 30529 1726882707.47191: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.47205: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.47211: variable 'omit' from source: magic vars 30529 1726882707.47246: variable 'omit' from source: magic vars 30529 1726882707.47268: variable 'omit' from source: magic vars 30529 1726882707.47304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882707.47330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882707.47346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882707.47359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.47369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.47397: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882707.47400: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.47402: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 30529 1726882707.47471: Set connection var ansible_shell_executable to /bin/sh 30529 1726882707.47475: Set connection var ansible_pipelining to False 30529 1726882707.47477: Set connection var ansible_shell_type to sh 30529 1726882707.47485: Set connection var ansible_timeout to 10 30529 1726882707.47488: Set connection var ansible_connection to ssh 30529 1726882707.47501: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882707.47514: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.47516: variable 'ansible_connection' from source: unknown 30529 1726882707.47519: variable 'ansible_module_compression' from source: unknown 30529 1726882707.47521: variable 'ansible_shell_type' from source: unknown 30529 1726882707.47523: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.47526: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.47530: variable 'ansible_pipelining' from source: unknown 30529 1726882707.47532: variable 'ansible_timeout' from source: unknown 30529 1726882707.47536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.47635: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882707.47645: variable 'omit' from source: magic vars 30529 1726882707.47650: starting attempt loop 30529 1726882707.47653: running the handler 30529 1726882707.47664: handler run complete 30529 1726882707.47672: attempt loop complete, returning result 30529 1726882707.47674: _execute() done 30529 1726882707.47677: dumping result to json 30529 1726882707.47679: done dumping result, returning 30529 1726882707.47686: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-b0f1-edc0-000000002888] 30529 1726882707.47694: sending task result for task 12673a56-9f93-b0f1-edc0-000000002888 30529 1726882707.47768: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002888 30529 1726882707.47771: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30529 1726882707.47821: no more pending results, returning what we have 30529 1726882707.47825: results queue empty 30529 1726882707.47826: checking for any_errors_fatal 30529 1726882707.47828: done checking for any_errors_fatal 30529 1726882707.47828: checking for max_fail_percentage 30529 1726882707.47830: done checking for max_fail_percentage 30529 1726882707.47831: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.47832: done checking to see if all hosts have failed 30529 1726882707.47832: getting the remaining hosts for this loop 30529 1726882707.47834: done getting the remaining hosts for this loop 30529 1726882707.47837: getting the next task for host managed_node1 30529 1726882707.47846: done getting next task for host managed_node1 30529 1726882707.47849: ^ task is: TASK: Stat profile file 30529 1726882707.47853: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882707.47857: getting variables 30529 1726882707.47859: in VariableManager get_vars() 30529 1726882707.47905: Calling all_inventory to load vars for managed_node1 30529 1726882707.47907: Calling groups_inventory to load vars for managed_node1 30529 1726882707.47911: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.47920: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.47923: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.47925: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.48743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.49595: done with get_vars() 30529 1726882707.49609: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:38:27 -0400 (0:00:00.034) 0:02:01.522 ****** 30529 1726882707.49670: entering _queue_task() for managed_node1/stat 30529 1726882707.49883: worker is 1 (out of 1 available) 30529 1726882707.49897: exiting _queue_task() for managed_node1/stat 30529 1726882707.49910: done queuing things up, now waiting for results queue to drain 30529 1726882707.49913: 
waiting for pending results... 30529 1726882707.50084: running TaskExecutor() for managed_node1/TASK: Stat profile file 30529 1726882707.50167: in run() - task 12673a56-9f93-b0f1-edc0-000000002889 30529 1726882707.50180: variable 'ansible_search_path' from source: unknown 30529 1726882707.50183: variable 'ansible_search_path' from source: unknown 30529 1726882707.50218: calling self._execute() 30529 1726882707.50290: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.50297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.50361: variable 'omit' from source: magic vars 30529 1726882707.50574: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.50590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.50600: variable 'omit' from source: magic vars 30529 1726882707.50634: variable 'omit' from source: magic vars 30529 1726882707.50710: variable 'profile' from source: play vars 30529 1726882707.50713: variable 'interface' from source: play vars 30529 1726882707.50758: variable 'interface' from source: play vars 30529 1726882707.50772: variable 'omit' from source: magic vars 30529 1726882707.50808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882707.50835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882707.50851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882707.50864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.50875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.50903: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882707.50907: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.50909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.50978: Set connection var ansible_shell_executable to /bin/sh 30529 1726882707.50981: Set connection var ansible_pipelining to False 30529 1726882707.50984: Set connection var ansible_shell_type to sh 30529 1726882707.50996: Set connection var ansible_timeout to 10 30529 1726882707.50999: Set connection var ansible_connection to ssh 30529 1726882707.51004: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882707.51025: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.51028: variable 'ansible_connection' from source: unknown 30529 1726882707.51031: variable 'ansible_module_compression' from source: unknown 30529 1726882707.51034: variable 'ansible_shell_type' from source: unknown 30529 1726882707.51036: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.51038: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.51041: variable 'ansible_pipelining' from source: unknown 30529 1726882707.51043: variable 'ansible_timeout' from source: unknown 30529 1726882707.51045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.51183: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882707.51195: variable 'omit' from source: magic vars 30529 1726882707.51201: starting attempt loop 30529 1726882707.51204: running the handler 30529 1726882707.51217: _low_level_execute_command(): starting 30529 1726882707.51225: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 
1726882707.51741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.51745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882707.51748: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882707.51750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.51799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.51802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.51814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.51856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.53503: stdout chunk (state=3): >>>/root <<< 30529 1726882707.53601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.53631: stderr chunk (state=3): >>><<< 30529 1726882707.53634: stdout chunk (state=3): >>><<< 30529 1726882707.53655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.53670: _low_level_execute_command(): starting 30529 1726882707.53676: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658 `" && echo ansible-tmp-1726882707.5365534-35819-175662557452658="` echo /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658 `" ) && sleep 0' 30529 1726882707.54114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.54119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882707.54128: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.54130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.54132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882707.54134: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.54177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.54180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.54184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.54226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.56077: stdout chunk (state=3): >>>ansible-tmp-1726882707.5365534-35819-175662557452658=/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658 <<< 30529 1726882707.56188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.56216: stderr chunk (state=3): >>><<< 30529 1726882707.56219: stdout chunk (state=3): >>><<< 30529 1726882707.56235: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882707.5365534-35819-175662557452658=/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.56272: variable 'ansible_module_compression' from source: unknown 30529 1726882707.56320: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882707.56350: variable 'ansible_facts' from source: unknown 30529 1726882707.56413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py 30529 1726882707.56512: Sending initial data 30529 1726882707.56515: Sent initial data (153 bytes) 30529 1726882707.56942: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.56946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882707.56948: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.56950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.56952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.57004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.57011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.57050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.58553: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882707.58559: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882707.58595: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882707.58635: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp59zlv8jd /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py <<< 30529 1726882707.58638: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py" <<< 30529 1726882707.58674: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp59zlv8jd" to remote "/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py" <<< 30529 1726882707.59181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.59222: stderr chunk (state=3): >>><<< 30529 1726882707.59226: stdout chunk (state=3): >>><<< 30529 1726882707.59261: done transferring module to remote 30529 1726882707.59270: _low_level_execute_command(): starting 30529 1726882707.59274: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/ /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py && sleep 0' 30529 1726882707.59703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.59707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882707.59710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30529 1726882707.59729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.59772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.59775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.59820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.61543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.61568: stderr chunk (state=3): >>><<< 30529 1726882707.61571: stdout chunk (state=3): >>><<< 30529 1726882707.61590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.61596: _low_level_execute_command(): starting 30529 1726882707.61598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/AnsiballZ_stat.py && sleep 0' 30529 1726882707.62044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.62048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.62050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.62052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882707.62054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.62108: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.62111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.62115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.62160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.77074: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882707.78204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882707.78229: stderr chunk (state=3): >>><<< 30529 1726882707.78232: stdout chunk (state=3): >>><<< 30529 1726882707.78247: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882707.78273: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882707.78285: _low_level_execute_command(): starting 30529 1726882707.78289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882707.5365534-35819-175662557452658/ > /dev/null 2>&1 && sleep 0' 30529 1726882707.78745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.78748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.78750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882707.78756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.78759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.78804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.78807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.78854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.80621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.80644: stderr chunk (state=3): >>><<< 30529 1726882707.80647: stdout chunk (state=3): >>><<< 30529 1726882707.80659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.80667: handler run complete 30529 1726882707.80684: attempt loop complete, returning result 30529 1726882707.80686: _execute() done 30529 1726882707.80691: dumping result to json 30529 1726882707.80696: done dumping result, returning 30529 1726882707.80701: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-b0f1-edc0-000000002889] 30529 1726882707.80706: sending task result for task 12673a56-9f93-b0f1-edc0-000000002889 30529 1726882707.80802: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002889 30529 1726882707.80806: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882707.80859: no more pending results, returning what we have 30529 1726882707.80862: results queue empty 30529 1726882707.80863: checking for any_errors_fatal 30529 1726882707.80872: done checking for any_errors_fatal 30529 1726882707.80873: checking for max_fail_percentage 30529 1726882707.80875: done checking for max_fail_percentage 30529 1726882707.80875: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.80876: done checking to see if all hosts have failed 30529 1726882707.80877: getting the remaining hosts for this loop 30529 1726882707.80878: done getting the remaining hosts for this loop 30529 1726882707.80882: getting the next task for host managed_node1 
30529 1726882707.80895: done getting next task for host managed_node1 30529 1726882707.80897: ^ task is: TASK: Set NM profile exist flag based on the profile files 30529 1726882707.80902: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882707.80907: getting variables 30529 1726882707.80908: in VariableManager get_vars() 30529 1726882707.80955: Calling all_inventory to load vars for managed_node1 30529 1726882707.80957: Calling groups_inventory to load vars for managed_node1 30529 1726882707.80961: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.80971: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.80973: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.80975: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.81810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.82779: done with get_vars() 30529 1726882707.82798: done getting variables 30529 1726882707.82841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:38:27 -0400 (0:00:00.331) 0:02:01.854 ****** 30529 1726882707.82865: entering _queue_task() for managed_node1/set_fact 30529 1726882707.83098: worker is 1 (out of 1 available) 30529 1726882707.83109: exiting _queue_task() for managed_node1/set_fact 30529 1726882707.83122: done queuing things up, now waiting for results queue to drain 30529 1726882707.83124: waiting for pending results... 
30529 1726882707.83301: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 30529 1726882707.83379: in run() - task 12673a56-9f93-b0f1-edc0-00000000288a 30529 1726882707.83395: variable 'ansible_search_path' from source: unknown 30529 1726882707.83398: variable 'ansible_search_path' from source: unknown 30529 1726882707.83424: calling self._execute() 30529 1726882707.83502: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.83507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.83516: variable 'omit' from source: magic vars 30529 1726882707.83902: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.83906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.83909: variable 'profile_stat' from source: set_fact 30529 1726882707.83911: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882707.83914: when evaluation is False, skipping this task 30529 1726882707.83917: _execute() done 30529 1726882707.83920: dumping result to json 30529 1726882707.83922: done dumping result, returning 30529 1726882707.83924: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-b0f1-edc0-00000000288a] 30529 1726882707.83926: sending task result for task 12673a56-9f93-b0f1-edc0-00000000288a 30529 1726882707.84011: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000288a 30529 1726882707.84013: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882707.84057: no more pending results, returning what we have 30529 1726882707.84060: results queue empty 30529 1726882707.84061: checking for any_errors_fatal 30529 1726882707.84070: done checking for any_errors_fatal 30529 1726882707.84071: 
checking for max_fail_percentage 30529 1726882707.84072: done checking for max_fail_percentage 30529 1726882707.84073: checking to see if all hosts have failed and the running result is not ok 30529 1726882707.84074: done checking to see if all hosts have failed 30529 1726882707.84075: getting the remaining hosts for this loop 30529 1726882707.84077: done getting the remaining hosts for this loop 30529 1726882707.84080: getting the next task for host managed_node1 30529 1726882707.84090: done getting next task for host managed_node1 30529 1726882707.84094: ^ task is: TASK: Get NM profile info 30529 1726882707.84099: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882707.84101: getting variables 30529 1726882707.84103: in VariableManager get_vars() 30529 1726882707.84140: Calling all_inventory to load vars for managed_node1 30529 1726882707.84142: Calling groups_inventory to load vars for managed_node1 30529 1726882707.84145: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882707.84154: Calling all_plugins_play to load vars for managed_node1 30529 1726882707.84157: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882707.84160: Calling groups_plugins_play to load vars for managed_node1 30529 1726882707.84915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882707.85783: done with get_vars() 30529 1726882707.85802: done getting variables 30529 1726882707.85843: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:38:27 -0400 (0:00:00.030) 0:02:01.884 ****** 30529 1726882707.85867: entering _queue_task() for managed_node1/shell 30529 1726882707.86076: worker is 1 (out of 1 available) 30529 1726882707.86091: exiting _queue_task() for managed_node1/shell 30529 1726882707.86103: done queuing things up, now waiting for results queue to drain 30529 1726882707.86105: waiting for pending results... 
30529 1726882707.86269: running TaskExecutor() for managed_node1/TASK: Get NM profile info 30529 1726882707.86351: in run() - task 12673a56-9f93-b0f1-edc0-00000000288b 30529 1726882707.86365: variable 'ansible_search_path' from source: unknown 30529 1726882707.86369: variable 'ansible_search_path' from source: unknown 30529 1726882707.86397: calling self._execute() 30529 1726882707.86467: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.86471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.86480: variable 'omit' from source: magic vars 30529 1726882707.86747: variable 'ansible_distribution_major_version' from source: facts 30529 1726882707.86755: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882707.86763: variable 'omit' from source: magic vars 30529 1726882707.86801: variable 'omit' from source: magic vars 30529 1726882707.86870: variable 'profile' from source: play vars 30529 1726882707.86875: variable 'interface' from source: play vars 30529 1726882707.86922: variable 'interface' from source: play vars 30529 1726882707.86936: variable 'omit' from source: magic vars 30529 1726882707.86970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882707.86998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882707.87015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882707.87028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.87039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882707.87061: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 
1726882707.87064: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.87067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.87140: Set connection var ansible_shell_executable to /bin/sh 30529 1726882707.87144: Set connection var ansible_pipelining to False 30529 1726882707.87147: Set connection var ansible_shell_type to sh 30529 1726882707.87155: Set connection var ansible_timeout to 10 30529 1726882707.87157: Set connection var ansible_connection to ssh 30529 1726882707.87162: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882707.87177: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.87180: variable 'ansible_connection' from source: unknown 30529 1726882707.87183: variable 'ansible_module_compression' from source: unknown 30529 1726882707.87185: variable 'ansible_shell_type' from source: unknown 30529 1726882707.87191: variable 'ansible_shell_executable' from source: unknown 30529 1726882707.87195: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882707.87197: variable 'ansible_pipelining' from source: unknown 30529 1726882707.87200: variable 'ansible_timeout' from source: unknown 30529 1726882707.87203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882707.87298: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882707.87307: variable 'omit' from source: magic vars 30529 1726882707.87314: starting attempt loop 30529 1726882707.87317: running the handler 30529 1726882707.87327: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882707.87344: _low_level_execute_command(): starting 30529 1726882707.87351: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882707.87858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.87864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.87867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.87870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.87925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.87928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.87931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.87981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.89565: stdout chunk (state=3): >>>/root <<< 30529 1726882707.89662: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.89691: stderr chunk (state=3): >>><<< 30529 1726882707.89696: stdout chunk (state=3): >>><<< 30529 1726882707.89716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.89727: _low_level_execute_command(): starting 30529 1726882707.89732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724 `" && echo ansible-tmp-1726882707.8971536-35828-159809992851724="` echo /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724 `" ) && sleep 0' 30529 1726882707.90161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.90164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.90166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.90168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882707.90170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.90216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.90219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.90271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.92134: stdout chunk (state=3): >>>ansible-tmp-1726882707.8971536-35828-159809992851724=/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724 <<< 30529 1726882707.92230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.92252: stderr chunk (state=3): >>><<< 30529 1726882707.92255: stdout chunk (state=3): >>><<< 30529 1726882707.92270: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882707.8971536-35828-159809992851724=/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.92300: variable 'ansible_module_compression' from source: unknown 30529 1726882707.92338: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882707.92366: variable 'ansible_facts' from source: unknown 30529 1726882707.92426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py 30529 1726882707.92519: Sending initial data 30529 1726882707.92522: Sent initial data (156 bytes) 30529 1726882707.92937: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.92940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882707.92942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.92945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.92946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.92999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.93006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.93043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.94565: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882707.94570: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882707.94602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882707.94646: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9x1faj59 /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py <<< 30529 1726882707.94649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py" <<< 30529 1726882707.94686: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp9x1faj59" to remote "/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py" <<< 30529 1726882707.94691: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py" <<< 30529 1726882707.95218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.95254: stderr chunk (state=3): >>><<< 30529 1726882707.95257: stdout chunk (state=3): >>><<< 30529 1726882707.95272: done transferring module to remote 30529 1726882707.95280: _low_level_execute_command(): starting 30529 1726882707.95284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/ /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py && sleep 0' 30529 1726882707.95694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.95701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.95703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.95705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882707.95707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.95748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.95751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.95802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882707.97501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882707.97528: stderr chunk (state=3): >>><<< 30529 1726882707.97531: stdout chunk (state=3): >>><<< 30529 1726882707.97548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882707.97552: _low_level_execute_command(): starting 30529 1726882707.97554: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/AnsiballZ_command.py && sleep 0' 30529 1726882707.97989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882707.97994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.97997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882707.97999: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882707.98001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882707.98054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882707.98057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882707.98061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882707.98109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.14537: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:38:28.128736", "end": "2024-09-20 21:38:28.144299", "delta": "0:00:00.015563", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882708.15900: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882708.15927: stderr chunk (state=3): >>><<< 30529 1726882708.15930: stdout chunk (state=3): >>><<< 30529 1726882708.15951: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:38:28.128736", "end": "2024-09-20 21:38:28.144299", "delta": "0:00:00.015563", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 
closed. 30529 1726882708.15980: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882708.15989: _low_level_execute_command(): starting 30529 1726882708.15998: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882707.8971536-35828-159809992851724/ > /dev/null 2>&1 && sleep 0' 30529 1726882708.16448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882708.16451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882708.16453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.16456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30529 1726882708.16458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.16498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882708.16512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.16561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.18365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.18389: stderr chunk (state=3): >>><<< 30529 1726882708.18395: stdout chunk (state=3): >>><<< 30529 1726882708.18412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882708.18419: handler run complete 30529 1726882708.18436: Evaluated conditional (False): False 30529 1726882708.18444: attempt loop complete, returning result 30529 1726882708.18447: _execute() done 30529 1726882708.18449: dumping result to json 30529 1726882708.18453: done dumping result, returning 30529 1726882708.18461: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-b0f1-edc0-00000000288b] 30529 1726882708.18465: sending task result for task 12673a56-9f93-b0f1-edc0-00000000288b 30529 1726882708.18566: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000288b 30529 1726882708.18569: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.015563", "end": "2024-09-20 21:38:28.144299", "rc": 1, "start": "2024-09-20 21:38:28.128736" } MSG: non-zero return code ...ignoring 30529 1726882708.18640: no more pending results, returning what we have 30529 1726882708.18643: results queue empty 30529 1726882708.18644: checking for any_errors_fatal 30529 1726882708.18653: done checking for any_errors_fatal 30529 1726882708.18654: checking for max_fail_percentage 30529 1726882708.18656: done checking for max_fail_percentage 30529 1726882708.18657: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.18658: done checking to see if all hosts have failed 30529 1726882708.18658: getting the remaining hosts for this loop 30529 1726882708.18660: done getting the remaining hosts for this loop 30529 1726882708.18664: getting the next task for host managed_node1 30529 1726882708.18672: done getting next task for host managed_node1 30529 1726882708.18675: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30529 1726882708.18680: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.18684: getting variables 30529 1726882708.18685: in VariableManager get_vars() 30529 1726882708.18733: Calling all_inventory to load vars for managed_node1 30529 1726882708.18736: Calling groups_inventory to load vars for managed_node1 30529 1726882708.18739: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.18750: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.18753: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.18755: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.19741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.20624: done with get_vars() 30529 1726882708.20642: done getting variables 30529 1726882708.20684: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:38:28 -0400 (0:00:00.348) 0:02:02.233 ****** 30529 1726882708.20713: entering _queue_task() for managed_node1/set_fact 30529 1726882708.20964: worker is 1 (out of 1 available) 30529 1726882708.20977: exiting _queue_task() for managed_node1/set_fact 30529 1726882708.20994: done queuing things up, now waiting for results queue to drain 30529 1726882708.20996: waiting for pending results... 
30529 1726882708.21171: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30529 1726882708.21264: in run() - task 12673a56-9f93-b0f1-edc0-00000000288c 30529 1726882708.21278: variable 'ansible_search_path' from source: unknown 30529 1726882708.21281: variable 'ansible_search_path' from source: unknown 30529 1726882708.21312: calling self._execute() 30529 1726882708.21383: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.21387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.21401: variable 'omit' from source: magic vars 30529 1726882708.21672: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.21682: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.21775: variable 'nm_profile_exists' from source: set_fact 30529 1726882708.21785: Evaluated conditional (nm_profile_exists.rc == 0): False 30529 1726882708.21791: when evaluation is False, skipping this task 30529 1726882708.21796: _execute() done 30529 1726882708.21799: dumping result to json 30529 1726882708.21802: done dumping result, returning 30529 1726882708.21807: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-b0f1-edc0-00000000288c] 30529 1726882708.21809: sending task result for task 12673a56-9f93-b0f1-edc0-00000000288c 30529 1726882708.21901: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000288c 30529 1726882708.21904: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30529 1726882708.21951: no more pending results, returning what we have 30529 1726882708.21957: results queue empty 30529 1726882708.21958: checking for any_errors_fatal 30529 
1726882708.21968: done checking for any_errors_fatal 30529 1726882708.21969: checking for max_fail_percentage 30529 1726882708.21971: done checking for max_fail_percentage 30529 1726882708.21972: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.21973: done checking to see if all hosts have failed 30529 1726882708.21973: getting the remaining hosts for this loop 30529 1726882708.21976: done getting the remaining hosts for this loop 30529 1726882708.21980: getting the next task for host managed_node1 30529 1726882708.21996: done getting next task for host managed_node1 30529 1726882708.21998: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882708.22003: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.22008: getting variables 30529 1726882708.22010: in VariableManager get_vars() 30529 1726882708.22052: Calling all_inventory to load vars for managed_node1 30529 1726882708.22054: Calling groups_inventory to load vars for managed_node1 30529 1726882708.22058: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.22069: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.22071: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.22073: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.23362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.24224: done with get_vars() 30529 1726882708.24240: done getting variables 30529 1726882708.24283: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882708.24370: variable 'profile' from source: play vars 30529 1726882708.24373: variable 'interface' from source: play vars 30529 1726882708.24417: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:38:28 -0400 (0:00:00.037) 0:02:02.270 ****** 30529 1726882708.24441: entering _queue_task() for managed_node1/command 30529 1726882708.24690: worker is 1 (out of 1 available) 30529 1726882708.24705: exiting _queue_task() for managed_node1/command 30529 1726882708.24718: done queuing things up, now waiting for results queue to drain 30529 1726882708.24720: waiting for pending results... 
30529 1726882708.24906: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr 30529 1726882708.24988: in run() - task 12673a56-9f93-b0f1-edc0-00000000288e 30529 1726882708.25005: variable 'ansible_search_path' from source: unknown 30529 1726882708.25010: variable 'ansible_search_path' from source: unknown 30529 1726882708.25035: calling self._execute() 30529 1726882708.25124: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.25127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.25153: variable 'omit' from source: magic vars 30529 1726882708.25661: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.25826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.25830: variable 'profile_stat' from source: set_fact 30529 1726882708.25832: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882708.25835: when evaluation is False, skipping this task 30529 1726882708.25837: _execute() done 30529 1726882708.25840: dumping result to json 30529 1726882708.25842: done dumping result, returning 30529 1726882708.25844: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000288e] 30529 1726882708.25847: sending task result for task 12673a56-9f93-b0f1-edc0-00000000288e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882708.26008: no more pending results, returning what we have 30529 1726882708.26012: results queue empty 30529 1726882708.26013: checking for any_errors_fatal 30529 1726882708.26020: done checking for any_errors_fatal 30529 1726882708.26021: checking for max_fail_percentage 30529 1726882708.26022: done checking for max_fail_percentage 30529 1726882708.26023: checking to see if all hosts 
have failed and the running result is not ok 30529 1726882708.26024: done checking to see if all hosts have failed 30529 1726882708.26025: getting the remaining hosts for this loop 30529 1726882708.26026: done getting the remaining hosts for this loop 30529 1726882708.26030: getting the next task for host managed_node1 30529 1726882708.26038: done getting next task for host managed_node1 30529 1726882708.26040: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30529 1726882708.26045: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.26049: getting variables 30529 1726882708.26050: in VariableManager get_vars() 30529 1726882708.26097: Calling all_inventory to load vars for managed_node1 30529 1726882708.26099: Calling groups_inventory to load vars for managed_node1 30529 1726882708.26103: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.26117: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.26121: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.26124: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.26646: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000288e 30529 1726882708.26649: WORKER PROCESS EXITING 30529 1726882708.27783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.29416: done with get_vars() 30529 1726882708.29442: done getting variables 30529 1726882708.29511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882708.29631: variable 'profile' from source: play vars 30529 1726882708.29635: variable 'interface' from source: play vars 30529 1726882708.29699: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-statebr] *********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 21:38:28 -0400 (0:00:00.052) 0:02:02.323 ******

30529 1726882708.29735: entering _queue_task() for managed_node1/set_fact 30529 1726882708.30155: worker is 1 (out of 1 available) 30529 1726882708.30168: exiting _queue_task() for managed_node1/set_fact 30529
1726882708.30182: done queuing things up, now waiting for results queue to drain 30529 1726882708.30184: waiting for pending results... 30529 1726882708.30609: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr 30529 1726882708.30634: in run() - task 12673a56-9f93-b0f1-edc0-00000000288f 30529 1726882708.30656: variable 'ansible_search_path' from source: unknown 30529 1726882708.30665: variable 'ansible_search_path' from source: unknown 30529 1726882708.30707: calling self._execute() 30529 1726882708.30810: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.30823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.30843: variable 'omit' from source: magic vars 30529 1726882708.31208: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.31225: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.31352: variable 'profile_stat' from source: set_fact 30529 1726882708.31367: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882708.31377: when evaluation is False, skipping this task 30529 1726882708.31384: _execute() done 30529 1726882708.31391: dumping result to json 30529 1726882708.31488: done dumping result, returning 30529 1726882708.31491: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-00000000288f] 30529 1726882708.31495: sending task result for task 12673a56-9f93-b0f1-edc0-00000000288f 30529 1726882708.31565: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000288f 30529 1726882708.31568: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882708.31640: no more pending results, returning what we have 30529 1726882708.31644: results queue empty 
30529 1726882708.31646: checking for any_errors_fatal 30529 1726882708.31652: done checking for any_errors_fatal 30529 1726882708.31653: checking for max_fail_percentage 30529 1726882708.31654: done checking for max_fail_percentage 30529 1726882708.31655: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.31656: done checking to see if all hosts have failed 30529 1726882708.31657: getting the remaining hosts for this loop 30529 1726882708.31659: done getting the remaining hosts for this loop 30529 1726882708.31663: getting the next task for host managed_node1 30529 1726882708.31673: done getting next task for host managed_node1 30529 1726882708.31676: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30529 1726882708.31682: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.31687: getting variables 30529 1726882708.31689: in VariableManager get_vars() 30529 1726882708.31740: Calling all_inventory to load vars for managed_node1 30529 1726882708.31743: Calling groups_inventory to load vars for managed_node1 30529 1726882708.31747: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.31761: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.31765: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.31768: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.33383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.35001: done with get_vars() 30529 1726882708.35019: done getting variables 30529 1726882708.35065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882708.35149: variable 'profile' from source: play vars 30529 1726882708.35152: variable 'interface' from source: play vars 30529 1726882708.35196: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:38:28 -0400 (0:00:00.054) 0:02:02.378 ******

30529 1726882708.35221: entering _queue_task() for managed_node1/command 30529 1726882708.35487: worker is 1 (out of 1 available) 30529 1726882708.35503: exiting _queue_task() for managed_node1/command 30529 1726882708.35516: done queuing things up, now waiting for results queue to drain 30529 1726882708.35518: waiting for pending results... 
30529 1726882708.35709: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr 30529 1726882708.35788: in run() - task 12673a56-9f93-b0f1-edc0-000000002890 30529 1726882708.35806: variable 'ansible_search_path' from source: unknown 30529 1726882708.35810: variable 'ansible_search_path' from source: unknown 30529 1726882708.35835: calling self._execute() 30529 1726882708.35917: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.35920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.35929: variable 'omit' from source: magic vars 30529 1726882708.36207: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.36217: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.36305: variable 'profile_stat' from source: set_fact 30529 1726882708.36313: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882708.36317: when evaluation is False, skipping this task 30529 1726882708.36319: _execute() done 30529 1726882708.36322: dumping result to json 30529 1726882708.36325: done dumping result, returning 30529 1726882708.36332: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000002890] 30529 1726882708.36336: sending task result for task 12673a56-9f93-b0f1-edc0-000000002890 30529 1726882708.36420: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002890 30529 1726882708.36423: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882708.36476: no more pending results, returning what we have 30529 1726882708.36480: results queue empty 30529 1726882708.36481: checking for any_errors_fatal 30529 1726882708.36490: done checking for any_errors_fatal 30529 1726882708.36491: checking for 
max_fail_percentage 30529 1726882708.36495: done checking for max_fail_percentage 30529 1726882708.36496: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.36496: done checking to see if all hosts have failed 30529 1726882708.36497: getting the remaining hosts for this loop 30529 1726882708.36499: done getting the remaining hosts for this loop 30529 1726882708.36502: getting the next task for host managed_node1 30529 1726882708.36511: done getting next task for host managed_node1 30529 1726882708.36513: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30529 1726882708.36519: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.36523: getting variables 30529 1726882708.36525: in VariableManager get_vars() 30529 1726882708.36606: Calling all_inventory to load vars for managed_node1 30529 1726882708.36609: Calling groups_inventory to load vars for managed_node1 30529 1726882708.36612: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.36623: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.36626: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.36628: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.37932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.38796: done with get_vars() 30529 1726882708.38811: done getting variables 30529 1726882708.38855: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882708.38933: variable 'profile' from source: play vars 30529 1726882708.38935: variable 'interface' from source: play vars 30529 1726882708.38972: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:38:28 -0400 (0:00:00.037) 0:02:02.416 ******

30529 1726882708.38999: entering _queue_task() for managed_node1/set_fact 30529 1726882708.39233: worker is 1 (out of 1 available) 30529 1726882708.39246: exiting _queue_task() for managed_node1/set_fact 30529 1726882708.39259: done queuing things up, now waiting for results queue to drain 30529 1726882708.39261: waiting for pending results... 
30529 1726882708.39444: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr 30529 1726882708.39526: in run() - task 12673a56-9f93-b0f1-edc0-000000002891 30529 1726882708.39538: variable 'ansible_search_path' from source: unknown 30529 1726882708.39542: variable 'ansible_search_path' from source: unknown 30529 1726882708.39568: calling self._execute() 30529 1726882708.39644: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.39647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.39656: variable 'omit' from source: magic vars 30529 1726882708.39996: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.40000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.40161: variable 'profile_stat' from source: set_fact 30529 1726882708.40165: Evaluated conditional (profile_stat.stat.exists): False 30529 1726882708.40167: when evaluation is False, skipping this task 30529 1726882708.40169: _execute() done 30529 1726882708.40172: dumping result to json 30529 1726882708.40174: done dumping result, returning 30529 1726882708.40177: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-statebr [12673a56-9f93-b0f1-edc0-000000002891] 30529 1726882708.40179: sending task result for task 12673a56-9f93-b0f1-edc0-000000002891 30529 1726882708.40243: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002891 30529 1726882708.40246: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30529 1726882708.40288: no more pending results, returning what we have 30529 1726882708.40291: results queue empty 30529 1726882708.40292: checking for any_errors_fatal 30529 1726882708.40301: done checking for any_errors_fatal 30529 1726882708.40302: checking 
for max_fail_percentage 30529 1726882708.40303: done checking for max_fail_percentage 30529 1726882708.40304: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.40305: done checking to see if all hosts have failed 30529 1726882708.40306: getting the remaining hosts for this loop 30529 1726882708.40307: done getting the remaining hosts for this loop 30529 1726882708.40310: getting the next task for host managed_node1 30529 1726882708.40319: done getting next task for host managed_node1 30529 1726882708.40322: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30529 1726882708.40325: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.40329: getting variables 30529 1726882708.40331: in VariableManager get_vars() 30529 1726882708.40367: Calling all_inventory to load vars for managed_node1 30529 1726882708.40369: Calling groups_inventory to load vars for managed_node1 30529 1726882708.40372: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.40383: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.40386: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.40388: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.41815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.42651: done with get_vars() 30529 1726882708.42667: done getting variables 30529 1726882708.42711: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882708.42788: variable 'profile' from source: play vars 30529 1726882708.42792: variable 'interface' from source: play vars 30529 1726882708.42832: variable 'interface' from source: play vars

TASK [Assert that the profile is absent - 'statebr'] ***************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 21:38:28 -0400 (0:00:00.038) 0:02:02.454 ******

30529 1726882708.42857: entering _queue_task() for managed_node1/assert 30529 1726882708.43096: worker is 1 (out of 1 available) 30529 1726882708.43113: exiting _queue_task() for managed_node1/assert 30529 1726882708.43129: done queuing things up, now waiting for results queue to drain 30529 1726882708.43131: waiting for pending results... 
30529 1726882708.43404: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' 30529 1726882708.43485: in run() - task 12673a56-9f93-b0f1-edc0-000000002805 30529 1726882708.43699: variable 'ansible_search_path' from source: unknown 30529 1726882708.43702: variable 'ansible_search_path' from source: unknown 30529 1726882708.43705: calling self._execute() 30529 1726882708.43707: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.43710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.43712: variable 'omit' from source: magic vars 30529 1726882708.44027: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.44048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.44054: variable 'omit' from source: magic vars 30529 1726882708.44098: variable 'omit' from source: magic vars 30529 1726882708.44298: variable 'profile' from source: play vars 30529 1726882708.44301: variable 'interface' from source: play vars 30529 1726882708.44303: variable 'interface' from source: play vars 30529 1726882708.44306: variable 'omit' from source: magic vars 30529 1726882708.44321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882708.44361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882708.44385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882708.44413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.44432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.44479: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 30529 1726882708.44489: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.44505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.44616: Set connection var ansible_shell_executable to /bin/sh 30529 1726882708.44630: Set connection var ansible_pipelining to False 30529 1726882708.44638: Set connection var ansible_shell_type to sh 30529 1726882708.44652: Set connection var ansible_timeout to 10 30529 1726882708.44657: Set connection var ansible_connection to ssh 30529 1726882708.44666: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882708.44692: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.44742: variable 'ansible_connection' from source: unknown 30529 1726882708.44745: variable 'ansible_module_compression' from source: unknown 30529 1726882708.44748: variable 'ansible_shell_type' from source: unknown 30529 1726882708.44751: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.44753: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.44755: variable 'ansible_pipelining' from source: unknown 30529 1726882708.44758: variable 'ansible_timeout' from source: unknown 30529 1726882708.44761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.44878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882708.44898: variable 'omit' from source: magic vars 30529 1726882708.44960: starting attempt loop 30529 1726882708.44964: running the handler 30529 1726882708.45110: variable 'lsr_net_profile_exists' from source: set_fact 30529 1726882708.45119: Evaluated conditional (not 
lsr_net_profile_exists): True 30529 1726882708.45154: handler run complete 30529 1726882708.45157: attempt loop complete, returning result 30529 1726882708.45160: _execute() done 30529 1726882708.45162: dumping result to json 30529 1726882708.45165: done dumping result, returning 30529 1726882708.45167: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'statebr' [12673a56-9f93-b0f1-edc0-000000002805] 30529 1726882708.45169: sending task result for task 12673a56-9f93-b0f1-edc0-000000002805 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882708.45302: no more pending results, returning what we have 30529 1726882708.45305: results queue empty 30529 1726882708.45306: checking for any_errors_fatal 30529 1726882708.45314: done checking for any_errors_fatal 30529 1726882708.45315: checking for max_fail_percentage 30529 1726882708.45317: done checking for max_fail_percentage 30529 1726882708.45318: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.45319: done checking to see if all hosts have failed 30529 1726882708.45320: getting the remaining hosts for this loop 30529 1726882708.45321: done getting the remaining hosts for this loop 30529 1726882708.45325: getting the next task for host managed_node1 30529 1726882708.45337: done getting next task for host managed_node1 30529 1726882708.45340: ^ task is: TASK: Get NetworkManager RPM version 30529 1726882708.45344: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882708.45347: getting variables 30529 1726882708.45349: in VariableManager get_vars() 30529 1726882708.45391: Calling all_inventory to load vars for managed_node1 30529 1726882708.45395: Calling groups_inventory to load vars for managed_node1 30529 1726882708.45399: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.45409: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.45412: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.45414: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.46007: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002805 30529 1726882708.46011: WORKER PROCESS EXITING 30529 1726882708.46242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.47339: done with get_vars() 30529 1726882708.47361: done getting variables 30529 1726882708.47419: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get NetworkManager RPM version] ******************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7
Friday 
20 September 2024 21:38:28 -0400 (0:00:00.045) 0:02:02.500 ****** 30529 1726882708.47454: entering _queue_task() for managed_node1/command 30529 1726882708.47765: worker is 1 (out of 1 available) 30529 1726882708.47779: exiting _queue_task() for managed_node1/command 30529 1726882708.47796: done queuing things up, now waiting for results queue to drain 30529 1726882708.47798: waiting for pending results... 30529 1726882708.47991: running TaskExecutor() for managed_node1/TASK: Get NetworkManager RPM version 30529 1726882708.48076: in run() - task 12673a56-9f93-b0f1-edc0-000000002809 30529 1726882708.48087: variable 'ansible_search_path' from source: unknown 30529 1726882708.48091: variable 'ansible_search_path' from source: unknown 30529 1726882708.48123: calling self._execute() 30529 1726882708.48200: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.48204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.48212: variable 'omit' from source: magic vars 30529 1726882708.48482: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.48497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.48506: variable 'omit' from source: magic vars 30529 1726882708.48540: variable 'omit' from source: magic vars 30529 1726882708.48565: variable 'omit' from source: magic vars 30529 1726882708.48598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882708.48626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882708.48642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882708.48654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.48665: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.48689: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882708.48696: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.48699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.48768: Set connection var ansible_shell_executable to /bin/sh 30529 1726882708.48772: Set connection var ansible_pipelining to False 30529 1726882708.48774: Set connection var ansible_shell_type to sh 30529 1726882708.48783: Set connection var ansible_timeout to 10 30529 1726882708.48786: Set connection var ansible_connection to ssh 30529 1726882708.48795: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882708.48811: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.48814: variable 'ansible_connection' from source: unknown 30529 1726882708.48817: variable 'ansible_module_compression' from source: unknown 30529 1726882708.48820: variable 'ansible_shell_type' from source: unknown 30529 1726882708.48822: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.48826: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.48829: variable 'ansible_pipelining' from source: unknown 30529 1726882708.48831: variable 'ansible_timeout' from source: unknown 30529 1726882708.48837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.48934: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882708.48946: variable 'omit' from source: magic vars 30529 1726882708.48949: starting 
attempt loop 30529 1726882708.48952: running the handler 30529 1726882708.48964: _low_level_execute_command(): starting 30529 1726882708.48971: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882708.49458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.49461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.49465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.49519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882708.49523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.49573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.51240: stdout chunk (state=3): >>>/root <<< 30529 1726882708.51340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.51365: stderr chunk (state=3): >>><<< 30529 1726882708.51368: stdout chunk (state=3): >>><<< 30529 1726882708.51387: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882708.51404: _low_level_execute_command(): starting 30529 1726882708.51411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156 `" && echo ansible-tmp-1726882708.5138628-35855-106974940842156="` echo /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156 `" ) && sleep 0' 30529 1726882708.51822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.51825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.51830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.51838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.51881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882708.51884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.51933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.53800: stdout chunk (state=3): >>>ansible-tmp-1726882708.5138628-35855-106974940842156=/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156 <<< 30529 1726882708.53905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.53927: stderr chunk (state=3): >>><<< 30529 1726882708.53930: stdout chunk (state=3): >>><<< 30529 1726882708.53943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882708.5138628-35855-106974940842156=/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882708.53967: variable 'ansible_module_compression' from source: unknown 30529 1726882708.54012: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882708.54042: variable 'ansible_facts' from source: unknown 30529 1726882708.54099: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py 30529 1726882708.54187: Sending initial data 30529 1726882708.54195: Sent initial data (156 bytes) 30529 1726882708.54617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.54621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882708.54623: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.54625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882708.54627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882708.54629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.54674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882708.54677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.54723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.56264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882708.56271: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882708.56306: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882708.56346: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpr3c529rm /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py <<< 30529 1726882708.56354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py" <<< 30529 1726882708.56386: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpr3c529rm" to remote "/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py" <<< 30529 1726882708.56912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.56946: stderr chunk (state=3): >>><<< 30529 1726882708.56949: stdout chunk (state=3): >>><<< 30529 1726882708.56978: done transferring module to remote 30529 1726882708.56986: _low_level_execute_command(): starting 30529 1726882708.56989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/ /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py && sleep 0' 30529 1726882708.57375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.57379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30529 1726882708.57392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.57446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882708.57454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.57496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.59274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.59298: stderr chunk (state=3): >>><<< 30529 1726882708.59301: stdout chunk (state=3): >>><<< 30529 1726882708.59313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882708.59316: _low_level_execute_command(): starting 30529 1726882708.59319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/AnsiballZ_command.py && sleep 0' 30529 1726882708.59697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882708.59700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.59714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882708.59730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.59770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882708.59773: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.59824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.91117: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:38:28.747895", "end": "2024-09-20 21:38:28.910031", "delta": "0:00:00.162136", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882708.92773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882708.92777: stdout chunk (state=3): >>><<< 30529 1726882708.92967: stderr chunk (state=3): >>><<< 30529 1726882708.92971: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:38:28.747895", "end": "2024-09-20 21:38:28.910031", "delta": "0:00:00.162136", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882708.92975: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882708.92978: _low_level_execute_command(): starting 30529 1726882708.92980: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882708.5138628-35855-106974940842156/ > /dev/null 2>&1 && sleep 0' 30529 1726882708.93483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882708.93502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.93514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882708.93555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882708.93568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882708.93578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882708.93624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882708.95420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882708.95441: stderr chunk (state=3): >>><<< 30529 1726882708.95444: stdout chunk (state=3): >>><<< 30529 1726882708.95455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882708.95460: handler run complete 30529 1726882708.95478: Evaluated conditional (False): False 30529 1726882708.95489: attempt loop complete, returning result 30529 1726882708.95495: _execute() done 30529 1726882708.95498: dumping result to json 30529 1726882708.95503: done dumping result, returning 30529 1726882708.95512: done running TaskExecutor() for managed_node1/TASK: Get NetworkManager RPM version [12673a56-9f93-b0f1-edc0-000000002809] 30529 1726882708.95514: sending task result for task 12673a56-9f93-b0f1-edc0-000000002809 30529 1726882708.95613: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002809 30529 1726882708.95617: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.162136", "end": "2024-09-20 21:38:28.910031", "rc": 0, "start": "2024-09-20 21:38:28.747895" } 
STDOUT: NetworkManager-1.48.10-1.el10 30529 1726882708.95682: no more pending results, returning what we have 30529 1726882708.95685: results queue empty 30529 1726882708.95687: checking for any_errors_fatal 30529 1726882708.95691: done checking for any_errors_fatal 30529 1726882708.95692: checking for max_fail_percentage 30529 1726882708.95699: done checking for max_fail_percentage 30529 1726882708.95700: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.95701: done checking to see if all hosts have failed 30529 1726882708.95702: getting the remaining hosts for this loop 30529 1726882708.95704: done getting the remaining hosts for this loop 30529 1726882708.95708: getting the next task for host managed_node1 30529 1726882708.95715: done getting next task for host managed_node1 30529 1726882708.95719: ^ task is: TASK: Store NetworkManager version 30529 1726882708.95722: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882708.95727: getting variables 30529 1726882708.95728: in VariableManager get_vars() 30529 1726882708.95772: Calling all_inventory to load vars for managed_node1 30529 1726882708.95774: Calling groups_inventory to load vars for managed_node1 30529 1726882708.95777: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.95787: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.95790: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.95792: Calling groups_plugins_play to load vars for managed_node1 30529 1726882708.96765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882708.97606: done with get_vars() 30529 1726882708.97623: done getting variables 30529 1726882708.97666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 21:38:28 -0400 (0:00:00.502) 0:02:03.002 ****** 30529 1726882708.97688: entering _queue_task() for managed_node1/set_fact 30529 1726882708.97916: worker is 1 (out of 1 available) 30529 1726882708.97928: exiting _queue_task() for managed_node1/set_fact 30529 1726882708.97941: done queuing things up, now waiting for results queue to drain 30529 1726882708.97943: waiting for pending results... 
30529 1726882708.98130: running TaskExecutor() for managed_node1/TASK: Store NetworkManager version 30529 1726882708.98209: in run() - task 12673a56-9f93-b0f1-edc0-00000000280a 30529 1726882708.98221: variable 'ansible_search_path' from source: unknown 30529 1726882708.98224: variable 'ansible_search_path' from source: unknown 30529 1726882708.98251: calling self._execute() 30529 1726882708.98327: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.98331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.98339: variable 'omit' from source: magic vars 30529 1726882708.98619: variable 'ansible_distribution_major_version' from source: facts 30529 1726882708.98630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882708.98635: variable 'omit' from source: magic vars 30529 1726882708.98667: variable 'omit' from source: magic vars 30529 1726882708.98745: variable '__rpm_q_networkmanager' from source: set_fact 30529 1726882708.98762: variable 'omit' from source: magic vars 30529 1726882708.98798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882708.98826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882708.98845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882708.98857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.98868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882708.98892: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882708.98900: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.98902: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.98974: Set connection var ansible_shell_executable to /bin/sh 30529 1726882708.98977: Set connection var ansible_pipelining to False 30529 1726882708.98980: Set connection var ansible_shell_type to sh 30529 1726882708.98988: Set connection var ansible_timeout to 10 30529 1726882708.98991: Set connection var ansible_connection to ssh 30529 1726882708.98999: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882708.99015: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.99019: variable 'ansible_connection' from source: unknown 30529 1726882708.99021: variable 'ansible_module_compression' from source: unknown 30529 1726882708.99023: variable 'ansible_shell_type' from source: unknown 30529 1726882708.99025: variable 'ansible_shell_executable' from source: unknown 30529 1726882708.99029: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882708.99032: variable 'ansible_pipelining' from source: unknown 30529 1726882708.99034: variable 'ansible_timeout' from source: unknown 30529 1726882708.99036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882708.99136: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882708.99147: variable 'omit' from source: magic vars 30529 1726882708.99150: starting attempt loop 30529 1726882708.99153: running the handler 30529 1726882708.99209: handler run complete 30529 1726882708.99213: attempt loop complete, returning result 30529 1726882708.99215: _execute() done 30529 1726882708.99219: dumping result to json 30529 1726882708.99221: done dumping result, returning 30529 
1726882708.99223: done running TaskExecutor() for managed_node1/TASK: Store NetworkManager version [12673a56-9f93-b0f1-edc0-00000000280a] 30529 1726882708.99225: sending task result for task 12673a56-9f93-b0f1-edc0-00000000280a 30529 1726882708.99288: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000280a 30529 1726882708.99291: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" }, "changed": false } 30529 1726882708.99340: no more pending results, returning what we have 30529 1726882708.99343: results queue empty 30529 1726882708.99344: checking for any_errors_fatal 30529 1726882708.99352: done checking for any_errors_fatal 30529 1726882708.99353: checking for max_fail_percentage 30529 1726882708.99355: done checking for max_fail_percentage 30529 1726882708.99355: checking to see if all hosts have failed and the running result is not ok 30529 1726882708.99356: done checking to see if all hosts have failed 30529 1726882708.99357: getting the remaining hosts for this loop 30529 1726882708.99359: done getting the remaining hosts for this loop 30529 1726882708.99362: getting the next task for host managed_node1 30529 1726882708.99369: done getting next task for host managed_node1 30529 1726882708.99372: ^ task is: TASK: Show NetworkManager version 30529 1726882708.99375: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882708.99378: getting variables 30529 1726882708.99380: in VariableManager get_vars() 30529 1726882708.99417: Calling all_inventory to load vars for managed_node1 30529 1726882708.99420: Calling groups_inventory to load vars for managed_node1 30529 1726882708.99423: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882708.99432: Calling all_plugins_play to load vars for managed_node1 30529 1726882708.99435: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882708.99437: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.00215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.01657: done with get_vars() 30529 1726882709.01673: done getting variables 30529 1726882709.01718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 21:38:29 -0400 (0:00:00.040) 0:02:03.043 ****** 30529 1726882709.01740: entering _queue_task() for managed_node1/debug 30529 1726882709.01959: worker is 1 (out of 1 available) 30529 1726882709.01972: exiting _queue_task() for managed_node1/debug 30529 1726882709.01985: done queuing things up, now waiting for results queue to drain 30529 1726882709.01987: waiting for pending 
results... 30529 1726882709.02173: running TaskExecutor() for managed_node1/TASK: Show NetworkManager version 30529 1726882709.02259: in run() - task 12673a56-9f93-b0f1-edc0-00000000280b 30529 1726882709.02270: variable 'ansible_search_path' from source: unknown 30529 1726882709.02273: variable 'ansible_search_path' from source: unknown 30529 1726882709.02303: calling self._execute() 30529 1726882709.02375: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.02379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.02387: variable 'omit' from source: magic vars 30529 1726882709.02669: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.02679: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.02686: variable 'omit' from source: magic vars 30529 1726882709.02722: variable 'omit' from source: magic vars 30529 1726882709.02744: variable 'omit' from source: magic vars 30529 1726882709.02778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882709.02808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882709.02825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882709.02838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.02848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.02873: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882709.02877: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.02879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 
1726882709.02949: Set connection var ansible_shell_executable to /bin/sh 30529 1726882709.02952: Set connection var ansible_pipelining to False 30529 1726882709.02955: Set connection var ansible_shell_type to sh 30529 1726882709.02963: Set connection var ansible_timeout to 10 30529 1726882709.02965: Set connection var ansible_connection to ssh 30529 1726882709.02971: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882709.02988: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.02996: variable 'ansible_connection' from source: unknown 30529 1726882709.02999: variable 'ansible_module_compression' from source: unknown 30529 1726882709.03001: variable 'ansible_shell_type' from source: unknown 30529 1726882709.03003: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.03005: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.03007: variable 'ansible_pipelining' from source: unknown 30529 1726882709.03009: variable 'ansible_timeout' from source: unknown 30529 1726882709.03011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.03110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882709.03119: variable 'omit' from source: magic vars 30529 1726882709.03124: starting attempt loop 30529 1726882709.03127: running the handler 30529 1726882709.03164: variable 'networkmanager_nvr' from source: set_fact 30529 1726882709.03221: variable 'networkmanager_nvr' from source: set_fact 30529 1726882709.03230: handler run complete 30529 1726882709.03242: attempt loop complete, returning result 30529 1726882709.03245: _execute() done 30529 1726882709.03248: dumping result to json 30529 
1726882709.03250: done dumping result, returning 30529 1726882709.03256: done running TaskExecutor() for managed_node1/TASK: Show NetworkManager version [12673a56-9f93-b0f1-edc0-00000000280b] 30529 1726882709.03260: sending task result for task 12673a56-9f93-b0f1-edc0-00000000280b 30529 1726882709.03464: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000280b 30529 1726882709.03468: WORKER PROCESS EXITING ok: [managed_node1] => { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" } 30529 1726882709.03547: no more pending results, returning what we have 30529 1726882709.03551: results queue empty 30529 1726882709.03552: checking for any_errors_fatal 30529 1726882709.03558: done checking for any_errors_fatal 30529 1726882709.03559: checking for max_fail_percentage 30529 1726882709.03560: done checking for max_fail_percentage 30529 1726882709.03561: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.03562: done checking to see if all hosts have failed 30529 1726882709.03562: getting the remaining hosts for this loop 30529 1726882709.03564: done getting the remaining hosts for this loop 30529 1726882709.03568: getting the next task for host managed_node1 30529 1726882709.03695: done getting next task for host managed_node1 30529 1726882709.03698: ^ task is: TASK: Conditional asserts 30529 1726882709.03700: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882709.03704: getting variables 30529 1726882709.03705: in VariableManager get_vars() 30529 1726882709.03739: Calling all_inventory to load vars for managed_node1 30529 1726882709.03741: Calling groups_inventory to load vars for managed_node1 30529 1726882709.03744: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.03753: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.03756: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.03758: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.05355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.06905: done with get_vars() 30529 1726882709.06928: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:38:29 -0400 (0:00:00.052) 0:02:03.096 ****** 30529 1726882709.07023: entering _queue_task() for managed_node1/include_tasks 30529 1726882709.07527: worker is 1 (out of 1 available) 30529 1726882709.07537: exiting _queue_task() for managed_node1/include_tasks 30529 1726882709.07548: done queuing things up, now waiting for results queue to drain 30529 1726882709.07550: waiting for pending results... 
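The two task results traced above — storing and then printing `networkmanager_nvr` — come from `tasks/get_NetworkManager_NVR.yml`. A minimal sketch of what that task file might contain, for orientation while reading the trace: the task names, the `networkmanager_nvr` fact, and the printed value are taken from the log, but the task bodies are a reconstruction, not the verified file contents.

```yaml
# Hypothetical reconstruction of tasks/get_NetworkManager_NVR.yml (illustration only).
# The log shows a set_fact task ("Store NetworkManager version") followed by a
# debug task ("Show NetworkManager version") that prints the stored fact.
- name: Store NetworkManager version
  set_fact:
    # In the real run this value was derived earlier (e.g. from a package query);
    # it is shown as a literal here purely for illustration.
    networkmanager_nvr: "NetworkManager-1.48.10-1.el10"

- name: Show NetworkManager version
  debug:
    var: networkmanager_nvr
```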
30529 1726882709.07791: running TaskExecutor() for managed_node1/TASK: Conditional asserts 30529 1726882709.07799: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b3 30529 1726882709.07802: variable 'ansible_search_path' from source: unknown 30529 1726882709.07811: variable 'ansible_search_path' from source: unknown 30529 1726882709.08096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30529 1726882709.10375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30529 1726882709.10460: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30529 1726882709.10514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30529 1726882709.10555: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30529 1726882709.10587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30529 1726882709.10703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30529 1726882709.10746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30529 1726882709.10825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30529 1726882709.10833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30529 1726882709.10853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30529 1726882709.10975: variable 'lsr_assert_when' from source: include params 30529 1726882709.11103: variable 'network_provider' from source: set_fact 30529 1726882709.11184: variable 'omit' from source: magic vars 30529 1726882709.11323: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.11336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.11349: variable 'omit' from source: magic vars 30529 1726882709.11557: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.11564: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.11647: variable 'item' from source: unknown 30529 1726882709.11658: Evaluated conditional (item['condition']): True 30529 1726882709.11716: variable 'item' from source: unknown 30529 1726882709.11741: variable 'item' from source: unknown 30529 1726882709.11785: variable 'item' from source: unknown 30529 1726882709.11930: dumping result to json 30529 1726882709.11933: done dumping result, returning 30529 1726882709.11935: done running TaskExecutor() for managed_node1/TASK: Conditional asserts [12673a56-9f93-b0f1-edc0-0000000020b3] 30529 1726882709.11937: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b3 30529 1726882709.11971: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b3 30529 1726882709.11973: WORKER PROCESS EXITING 30529 1726882709.11999: no more pending results, returning what we have 30529 1726882709.12004: in VariableManager get_vars() 30529 1726882709.12052: Calling all_inventory to load vars for managed_node1 30529 1726882709.12055: Calling groups_inventory to load vars for managed_node1 30529 1726882709.12058: 
Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.12069: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.12072: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.12074: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.12922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.14341: done with get_vars() 30529 1726882709.14355: variable 'ansible_search_path' from source: unknown 30529 1726882709.14356: variable 'ansible_search_path' from source: unknown 30529 1726882709.14388: we have included files to process 30529 1726882709.14390: generating all_blocks data 30529 1726882709.14391: done generating all_blocks data 30529 1726882709.14397: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882709.14398: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882709.14399: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30529 1726882709.14470: in VariableManager get_vars() 30529 1726882709.14485: done with get_vars() 30529 1726882709.14561: done processing included file 30529 1726882709.14562: iterating over new_blocks loaded from include file 30529 1726882709.14563: in VariableManager get_vars() 30529 1726882709.14574: done with get_vars() 30529 1726882709.14575: filtering new block on tags 30529 1726882709.14599: done filtering new block on tags 30529 1726882709.14602: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30529 1726882709.14606: extending task lists for all hosts with included blocks 30529 1726882709.15348: done extending task lists 30529 1726882709.15349: done processing included files 30529 1726882709.15349: results queue empty 30529 1726882709.15350: checking for any_errors_fatal 30529 1726882709.15352: done checking for any_errors_fatal 30529 1726882709.15353: checking for max_fail_percentage 30529 1726882709.15353: done checking for max_fail_percentage 30529 1726882709.15354: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.15355: done checking to see if all hosts have failed 30529 1726882709.15355: getting the remaining hosts for this loop 30529 1726882709.15356: done getting the remaining hosts for this loop 30529 1726882709.15359: getting the next task for host managed_node1 30529 1726882709.15362: done getting next task for host managed_node1 30529 1726882709.15363: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30529 1726882709.15365: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882709.15371: getting variables 30529 1726882709.15372: in VariableManager get_vars() 30529 1726882709.15380: Calling all_inventory to load vars for managed_node1 30529 1726882709.15381: Calling groups_inventory to load vars for managed_node1 30529 1726882709.15383: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.15387: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.15388: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.15390: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.16043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.17453: done with get_vars() 30529 1726882709.17475: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:38:29 -0400 (0:00:00.105) 0:02:03.201 ****** 30529 1726882709.17553: entering _queue_task() for managed_node1/include_tasks 30529 1726882709.17920: worker is 1 (out of 1 available) 30529 1726882709.17933: exiting _queue_task() for managed_node1/include_tasks 30529 1726882709.17948: done queuing things up, now waiting for results queue to drain 30529 1726882709.17949: waiting for pending results... 
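The "Conditional asserts" task traced above (`run_test.yml:42`) loops over `lsr_assert_when`, evaluates `item['condition']` per item, and includes `tasks/assert_device_absent.yml` for the item where the condition is true. A hedged sketch of that pattern — the variable names and the included file come from the log, but the exact task body is an assumption:

```yaml
# Hypothetical sketch of the "Conditional asserts" include pattern seen in the
# trace. With include_tasks, a `when` clause is evaluated once per loop item,
# which matches the per-item "Evaluated conditional (item['condition'])" lines.
- name: Conditional asserts
  include_tasks: "{{ item['what'] }}"
  when: item['condition']
  loop: "{{ lsr_assert_when }}"
  # In this run the log shows the item as:
  #   {'what': 'tasks/assert_device_absent.yml', 'condition': True}
```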
30529 1726882709.18522: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 30529 1726882709.18527: in run() - task 12673a56-9f93-b0f1-edc0-0000000028d3 30529 1726882709.18529: variable 'ansible_search_path' from source: unknown 30529 1726882709.18531: variable 'ansible_search_path' from source: unknown 30529 1726882709.18534: calling self._execute() 30529 1726882709.18536: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.18545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.18558: variable 'omit' from source: magic vars 30529 1726882709.18941: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.18961: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.18972: _execute() done 30529 1726882709.19057: dumping result to json 30529 1726882709.19061: done dumping result, returning 30529 1726882709.19064: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-b0f1-edc0-0000000028d3] 30529 1726882709.19066: sending task result for task 12673a56-9f93-b0f1-edc0-0000000028d3 30529 1726882709.19139: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000028d3 30529 1726882709.19143: WORKER PROCESS EXITING 30529 1726882709.19187: no more pending results, returning what we have 30529 1726882709.19194: in VariableManager get_vars() 30529 1726882709.19250: Calling all_inventory to load vars for managed_node1 30529 1726882709.19253: Calling groups_inventory to load vars for managed_node1 30529 1726882709.19256: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.19271: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.19275: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.19279: Calling groups_plugins_play to load vars for managed_node1 30529 
1726882709.20956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.22490: done with get_vars() 30529 1726882709.22511: variable 'ansible_search_path' from source: unknown 30529 1726882709.22513: variable 'ansible_search_path' from source: unknown 30529 1726882709.22636: variable 'item' from source: include params 30529 1726882709.22670: we have included files to process 30529 1726882709.22671: generating all_blocks data 30529 1726882709.22673: done generating all_blocks data 30529 1726882709.22674: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882709.22675: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882709.22677: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30529 1726882709.22847: done processing included file 30529 1726882709.22849: iterating over new_blocks loaded from include file 30529 1726882709.22850: in VariableManager get_vars() 30529 1726882709.22868: done with get_vars() 30529 1726882709.22869: filtering new block on tags 30529 1726882709.22894: done filtering new block on tags 30529 1726882709.22897: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 30529 1726882709.22902: extending task lists for all hosts with included blocks 30529 1726882709.23054: done extending task lists 30529 1726882709.23055: done processing included files 30529 1726882709.23056: results queue empty 30529 1726882709.23057: checking for any_errors_fatal 30529 1726882709.23060: done checking for any_errors_fatal 30529 1726882709.23061: checking for 
max_fail_percentage 30529 1726882709.23062: done checking for max_fail_percentage 30529 1726882709.23062: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.23063: done checking to see if all hosts have failed 30529 1726882709.23064: getting the remaining hosts for this loop 30529 1726882709.23065: done getting the remaining hosts for this loop 30529 1726882709.23068: getting the next task for host managed_node1 30529 1726882709.23072: done getting next task for host managed_node1 30529 1726882709.23074: ^ task is: TASK: Get stat for interface {{ interface }} 30529 1726882709.23077: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882709.23079: getting variables 30529 1726882709.23080: in VariableManager get_vars() 30529 1726882709.23091: Calling all_inventory to load vars for managed_node1 30529 1726882709.23095: Calling groups_inventory to load vars for managed_node1 30529 1726882709.23097: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.23102: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.23105: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.23107: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.24264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.25762: done with get_vars() 30529 1726882709.25785: done getting variables 30529 1726882709.25909: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:38:29 -0400 (0:00:00.083) 0:02:03.285 ****** 30529 1726882709.25940: entering _queue_task() for managed_node1/stat 30529 1726882709.26308: worker is 1 (out of 1 available) 30529 1726882709.26322: exiting _queue_task() for managed_node1/stat 30529 1726882709.26335: done queuing things up, now waiting for results queue to drain 30529 1726882709.26337: waiting for pending results... 
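The trace that follows executes the stat task from `tasks/get_interface_stat.yml:3` against the interface named `statebr` (from play vars). A minimal sketch of such a task, assuming the conventional `/sys/class/net` check — the task name and `interface` variable are from the log, while the path and register name are assumptions for illustration:

```yaml
# Hypothetical sketch of tasks/get_interface_stat.yml:3 as traced below.
# The /sys/class/net path and the "interface_stat" register name are
# assumptions, not the verified file contents.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```

The templated task name explains why the trace resolves `variable 'interface' from source: play vars` before printing the task banner `TASK [Get stat for interface statebr]`.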
30529 1726882709.26723: running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr 30529 1726882709.26769: in run() - task 12673a56-9f93-b0f1-edc0-000000002979 30529 1726882709.26789: variable 'ansible_search_path' from source: unknown 30529 1726882709.26799: variable 'ansible_search_path' from source: unknown 30529 1726882709.26842: calling self._execute() 30529 1726882709.26948: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.26960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.26973: variable 'omit' from source: magic vars 30529 1726882709.27338: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.27355: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.27399: variable 'omit' from source: magic vars 30529 1726882709.27433: variable 'omit' from source: magic vars 30529 1726882709.27598: variable 'interface' from source: play vars 30529 1726882709.27601: variable 'omit' from source: magic vars 30529 1726882709.27603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882709.27642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882709.27667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882709.27689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.27709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.27747: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882709.27756: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.27763: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.27871: Set connection var ansible_shell_executable to /bin/sh 30529 1726882709.27882: Set connection var ansible_pipelining to False 30529 1726882709.27890: Set connection var ansible_shell_type to sh 30529 1726882709.27908: Set connection var ansible_timeout to 10 30529 1726882709.27915: Set connection var ansible_connection to ssh 30529 1726882709.27925: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882709.27998: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.28002: variable 'ansible_connection' from source: unknown 30529 1726882709.28004: variable 'ansible_module_compression' from source: unknown 30529 1726882709.28006: variable 'ansible_shell_type' from source: unknown 30529 1726882709.28008: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.28010: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.28011: variable 'ansible_pipelining' from source: unknown 30529 1726882709.28013: variable 'ansible_timeout' from source: unknown 30529 1726882709.28015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.28196: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 30529 1726882709.28213: variable 'omit' from source: magic vars 30529 1726882709.28224: starting attempt loop 30529 1726882709.28273: running the handler 30529 1726882709.28277: _low_level_execute_command(): starting 30529 1726882709.28280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882709.28980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.28999: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30529 1726882709.29015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.29035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882709.29053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882709.29079: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.29171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.29206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.29283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.30924: stdout chunk (state=3): >>>/root <<< 30529 1726882709.31085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.31088: stdout chunk (state=3): >>><<< 30529 1726882709.31090: stderr chunk (state=3): >>><<< 30529 1726882709.31114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.31206: _low_level_execute_command(): starting 30529 1726882709.31210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411 `" && echo ansible-tmp-1726882709.311211-35882-186671573799411="` echo /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411 `" ) && sleep 0' 30529 1726882709.31738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.31754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.31772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.31861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.31900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882709.31917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.31940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.32021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.33863: stdout chunk (state=3): >>>ansible-tmp-1726882709.311211-35882-186671573799411=/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411 <<< 30529 1726882709.34020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.34025: stdout chunk (state=3): >>><<< 30529 1726882709.34027: stderr chunk (state=3): >>><<< 30529 1726882709.34199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882709.311211-35882-186671573799411=/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.34203: variable 'ansible_module_compression' from source: unknown 30529 1726882709.34205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30529 1726882709.34207: variable 'ansible_facts' from source: unknown 30529 1726882709.34275: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py 30529 1726882709.34447: Sending initial data 30529 1726882709.34457: Sent initial data (152 bytes) 30529 1726882709.35028: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.35044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.35060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.35096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.35114: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 30529 1726882709.35206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.35230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.35308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.36811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882709.36872: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30529 1726882709.36918: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpzqg5giws /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py <<< 30529 1726882709.36921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py" <<< 30529 1726882709.36987: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpzqg5giws" to remote "/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py" <<< 30529 1726882709.37705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.37732: stderr chunk (state=3): >>><<< 30529 1726882709.37743: stdout chunk (state=3): >>><<< 30529 1726882709.37798: done transferring module to remote 30529 1726882709.37801: _low_level_execute_command(): starting 30529 1726882709.37804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/ /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py && sleep 0' 30529 1726882709.38345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.38455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.38476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.38551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.40287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.40305: stdout chunk (state=3): >>><<< 30529 1726882709.40325: stderr chunk (state=3): >>><<< 30529 1726882709.40399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.40404: _low_level_execute_command(): starting 30529 1726882709.40406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/AnsiballZ_stat.py && sleep 0' 30529 1726882709.40932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.40948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.40963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.40979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882709.41060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 30529 1726882709.41063: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.41110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.41131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 
1726882709.41204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.56171: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30529 1726882709.57403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882709.57431: stderr chunk (state=3): >>><<< 30529 1726882709.57435: stdout chunk (state=3): >>><<< 30529 1726882709.57451: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882709.57474: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882709.57483: _low_level_execute_command(): starting 30529 1726882709.57488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882709.311211-35882-186671573799411/ > /dev/null 2>&1 && sleep 0' 30529 1726882709.57950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.57953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882709.57955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.57957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.57959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.58008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882709.58013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.58064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.59855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.59877: stderr chunk (state=3): >>><<< 30529 1726882709.59881: stdout chunk (state=3): >>><<< 30529 1726882709.59899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.59903: handler run complete 30529 1726882709.59919: attempt loop complete, returning result 30529 1726882709.59921: _execute() done 30529 1726882709.59924: dumping result to json 30529 1726882709.59926: done dumping result, returning 30529 1726882709.59935: done running TaskExecutor() for managed_node1/TASK: Get stat for interface statebr [12673a56-9f93-b0f1-edc0-000000002979] 30529 1726882709.59940: sending task result for task 12673a56-9f93-b0f1-edc0-000000002979 30529 1726882709.60034: done sending task result for task 12673a56-9f93-b0f1-edc0-000000002979 30529 1726882709.60037: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 30529 1726882709.60095: no more pending results, returning what we have 30529 1726882709.60099: results queue empty 30529 1726882709.60100: checking for any_errors_fatal 30529 1726882709.60101: done checking for any_errors_fatal 30529 1726882709.60102: checking for max_fail_percentage 30529 1726882709.60104: done checking for max_fail_percentage 30529 1726882709.60105: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.60105: done checking to see if all hosts have failed 30529 1726882709.60106: getting the remaining hosts for this loop 30529 1726882709.60108: done getting the remaining hosts for this loop 30529 1726882709.60111: getting the next task for host managed_node1 30529 1726882709.60122: done getting next task for host managed_node1 30529 1726882709.60124: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30529 1726882709.60128: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882709.60134: getting variables 30529 1726882709.60136: in VariableManager get_vars() 30529 1726882709.60181: Calling all_inventory to load vars for managed_node1 30529 1726882709.60184: Calling groups_inventory to load vars for managed_node1 30529 1726882709.60187: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.60205: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.60208: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.60211: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.65313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.66147: done with get_vars() 30529 1726882709.66164: done getting variables 30529 1726882709.66204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882709.66271: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] 
************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:38:29 -0400 (0:00:00.403) 0:02:03.689 ****** 30529 1726882709.66296: entering _queue_task() for managed_node1/assert 30529 1726882709.66576: worker is 1 (out of 1 available) 30529 1726882709.66595: exiting _queue_task() for managed_node1/assert 30529 1726882709.66608: done queuing things up, now waiting for results queue to drain 30529 1726882709.66610: waiting for pending results... 30529 1726882709.66784: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' 30529 1726882709.66873: in run() - task 12673a56-9f93-b0f1-edc0-0000000028d4 30529 1726882709.66883: variable 'ansible_search_path' from source: unknown 30529 1726882709.66886: variable 'ansible_search_path' from source: unknown 30529 1726882709.66919: calling self._execute() 30529 1726882709.66999: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.67003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.67012: variable 'omit' from source: magic vars 30529 1726882709.67292: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.67303: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.67309: variable 'omit' from source: magic vars 30529 1726882709.67339: variable 'omit' from source: magic vars 30529 1726882709.67409: variable 'interface' from source: play vars 30529 1726882709.67423: variable 'omit' from source: magic vars 30529 1726882709.67458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882709.67485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882709.67505: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882709.67518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.67529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.67551: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882709.67555: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.67558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.67630: Set connection var ansible_shell_executable to /bin/sh 30529 1726882709.67635: Set connection var ansible_pipelining to False 30529 1726882709.67638: Set connection var ansible_shell_type to sh 30529 1726882709.67646: Set connection var ansible_timeout to 10 30529 1726882709.67648: Set connection var ansible_connection to ssh 30529 1726882709.67653: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882709.67670: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.67673: variable 'ansible_connection' from source: unknown 30529 1726882709.67676: variable 'ansible_module_compression' from source: unknown 30529 1726882709.67678: variable 'ansible_shell_type' from source: unknown 30529 1726882709.67681: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.67683: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.67686: variable 'ansible_pipelining' from source: unknown 30529 1726882709.67692: variable 'ansible_timeout' from source: unknown 30529 1726882709.67696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.67791: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882709.67800: variable 'omit' from source: magic vars 30529 1726882709.67805: starting attempt loop 30529 1726882709.67808: running the handler 30529 1726882709.67907: variable 'interface_stat' from source: set_fact 30529 1726882709.67915: Evaluated conditional (not interface_stat.stat.exists): True 30529 1726882709.67922: handler run complete 30529 1726882709.67934: attempt loop complete, returning result 30529 1726882709.67937: _execute() done 30529 1726882709.67940: dumping result to json 30529 1726882709.67942: done dumping result, returning 30529 1726882709.67948: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'statebr' [12673a56-9f93-b0f1-edc0-0000000028d4] 30529 1726882709.67951: sending task result for task 12673a56-9f93-b0f1-edc0-0000000028d4 30529 1726882709.68034: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000028d4 30529 1726882709.68037: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 30529 1726882709.68081: no more pending results, returning what we have 30529 1726882709.68085: results queue empty 30529 1726882709.68086: checking for any_errors_fatal 30529 1726882709.68101: done checking for any_errors_fatal 30529 1726882709.68101: checking for max_fail_percentage 30529 1726882709.68103: done checking for max_fail_percentage 30529 1726882709.68104: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.68105: done checking to see if all hosts have failed 30529 1726882709.68106: getting the remaining hosts for this loop 30529 1726882709.68108: done getting the remaining hosts for this loop 30529 1726882709.68111: getting the next task for host managed_node1 30529 1726882709.68120: done getting next task for host managed_node1 
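[Editor's note] The two tasks traced above — a `stat` of `/sys/class/net/statebr` returning `{"stat": {"exists": false}}`, followed by an `assert` evaluating `not interface_stat.stat.exists` — implement a standard "device is absent" check. The sketch below mimics that result shape and conditional locally; it is an illustrative stand-in, not Ansible's actual `stat` module implementation, and `stat_sketch` is a hypothetical name.

```python
import os

def stat_sketch(path, follow=False):
    # Hedged sketch of the result shape the stat module returned in the
    # log (get_checksum/get_mime/get_attributes were all False, so only
    # existence is reported). NOT the real module code.
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}

# The follow-up assert task evaluates: not interface_stat.stat.exists
interface_stat = stat_sketch("/sys/class/net/statebr")
interface_absent = not interface_stat["stat"]["exists"]
```

When the device path does not exist, the conditional is True and the assert action reports "All assertions passed", exactly as in the `ok: [managed_node1]` result above.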
30529 1726882709.68122: ^ task is: TASK: Success in test '{{ lsr_description }}' 30529 1726882709.68125: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882709.68130: getting variables 30529 1726882709.68132: in VariableManager get_vars() 30529 1726882709.68176: Calling all_inventory to load vars for managed_node1 30529 1726882709.68178: Calling groups_inventory to load vars for managed_node1 30529 1726882709.68182: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.68198: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.68201: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.68204: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.69004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.69971: done with get_vars() 30529 1726882709.69987: done getting variables 30529 1726882709.70032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 30529 1726882709.70114: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task 
path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:38:29 -0400 (0:00:00.038) 0:02:03.727 ****** 30529 1726882709.70136: entering _queue_task() for managed_node1/debug 30529 1726882709.70375: worker is 1 (out of 1 available) 30529 1726882709.70392: exiting _queue_task() for managed_node1/debug 30529 1726882709.70406: done queuing things up, now waiting for results queue to drain 30529 1726882709.70409: waiting for pending results... 30529 1726882709.70582: running TaskExecutor() for managed_node1/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 30529 1726882709.70660: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b4 30529 1726882709.70673: variable 'ansible_search_path' from source: unknown 30529 1726882709.70676: variable 'ansible_search_path' from source: unknown 30529 1726882709.70706: calling self._execute() 30529 1726882709.70781: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.70785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.70802: variable 'omit' from source: magic vars 30529 1726882709.71067: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.71075: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.71081: variable 'omit' from source: magic vars 30529 1726882709.71113: variable 'omit' from source: magic vars 30529 1726882709.71181: variable 'lsr_description' from source: include params 30529 1726882709.71274: variable 'omit' from source: magic vars 30529 1726882709.71279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882709.71282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882709.71284: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882709.71287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.71292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.71316: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882709.71319: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.71323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.71395: Set connection var ansible_shell_executable to /bin/sh 30529 1726882709.71398: Set connection var ansible_pipelining to False 30529 1726882709.71401: Set connection var ansible_shell_type to sh 30529 1726882709.71410: Set connection var ansible_timeout to 10 30529 1726882709.71412: Set connection var ansible_connection to ssh 30529 1726882709.71414: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882709.71430: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.71433: variable 'ansible_connection' from source: unknown 30529 1726882709.71436: variable 'ansible_module_compression' from source: unknown 30529 1726882709.71438: variable 'ansible_shell_type' from source: unknown 30529 1726882709.71441: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.71443: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.71446: variable 'ansible_pipelining' from source: unknown 30529 1726882709.71448: variable 'ansible_timeout' from source: unknown 30529 1726882709.71453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.71550: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
30529 1726882709.71560: variable 'omit' from source: magic vars
30529 1726882709.71565: starting attempt loop
30529 1726882709.71567: running the handler
30529 1726882709.71608: handler run complete
30529 1726882709.71619: attempt loop complete, returning result
30529 1726882709.71623: _execute() done
30529 1726882709.71626: dumping result to json
30529 1726882709.71628: done dumping result, returning
30529 1726882709.71634: done running TaskExecutor() for managed_node1/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [12673a56-9f93-b0f1-edc0-0000000020b4]
30529 1726882709.71639: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b4
30529 1726882709.71721: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b4
30529 1726882709.71723: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

+++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++
30529 1726882709.71779: no more pending results, returning what we have
30529 1726882709.71782: results queue empty
30529 1726882709.71783: checking for any_errors_fatal
30529 1726882709.71795: done checking for any_errors_fatal
30529 1726882709.71796: checking for max_fail_percentage
30529 1726882709.71798: done checking for max_fail_percentage
30529 1726882709.71799: checking to see if all hosts have failed and the running result is not ok
30529 1726882709.71799: done checking to see if all hosts have failed
30529 1726882709.71800: getting the remaining hosts for this loop
30529 1726882709.71802: done getting the remaining hosts for this loop
30529 1726882709.71805: getting the next task for host managed_node1
30529 1726882709.71813: done getting next task for host managed_node1
30529 1726882709.71816: ^ task is: TASK: Cleanup
30529 1726882709.71819: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30529 1726882709.71824: getting variables
30529 1726882709.71825: in VariableManager get_vars()
30529 1726882709.71861: Calling all_inventory to load vars for managed_node1
30529 1726882709.71863: Calling groups_inventory to load vars for managed_node1
30529 1726882709.71866: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882709.71876: Calling all_plugins_play to load vars for managed_node1
30529 1726882709.71879: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882709.71882: Calling groups_plugins_play to load vars for managed_node1
30529 1726882709.72667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882709.73526: done with get_vars()
30529 1726882709.73541: done getting variables

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66
Friday 20 September 2024  21:38:29 -0400 (0:00:00.034)       0:02:03.762 ******
30529 1726882709.73604: entering _queue_task() for managed_node1/include_tasks
30529 1726882709.73817: worker is 1 (out of 1 available)
30529 1726882709.73829: exiting _queue_task() for managed_node1/include_tasks
30529 1726882709.73842: done queuing things up, now waiting for results queue to drain
30529
1726882709.73844: waiting for pending results... 30529 1726882709.74009: running TaskExecutor() for managed_node1/TASK: Cleanup 30529 1726882709.74098: in run() - task 12673a56-9f93-b0f1-edc0-0000000020b8 30529 1726882709.74107: variable 'ansible_search_path' from source: unknown 30529 1726882709.74110: variable 'ansible_search_path' from source: unknown 30529 1726882709.74143: variable 'lsr_cleanup' from source: include params 30529 1726882709.74296: variable 'lsr_cleanup' from source: include params 30529 1726882709.74345: variable 'omit' from source: magic vars 30529 1726882709.74442: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.74448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.74456: variable 'omit' from source: magic vars 30529 1726882709.74620: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.74627: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.74634: variable 'item' from source: unknown 30529 1726882709.74680: variable 'item' from source: unknown 30529 1726882709.74704: variable 'item' from source: unknown 30529 1726882709.74749: variable 'item' from source: unknown 30529 1726882709.74872: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.74875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.74878: variable 'omit' from source: magic vars 30529 1726882709.74956: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.74959: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.74964: variable 'item' from source: unknown 30529 1726882709.75011: variable 'item' from source: unknown 30529 1726882709.75030: variable 'item' from source: unknown 30529 1726882709.75073: variable 'item' from source: unknown 30529 1726882709.75139: dumping result to json 30529 
1726882709.75142: done dumping result, returning 30529 1726882709.75144: done running TaskExecutor() for managed_node1/TASK: Cleanup [12673a56-9f93-b0f1-edc0-0000000020b8] 30529 1726882709.75145: sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b8 30529 1726882709.75181: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000020b8 30529 1726882709.75183: WORKER PROCESS EXITING 30529 1726882709.75213: no more pending results, returning what we have 30529 1726882709.75218: in VariableManager get_vars() 30529 1726882709.75264: Calling all_inventory to load vars for managed_node1 30529 1726882709.75267: Calling groups_inventory to load vars for managed_node1 30529 1726882709.75270: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882709.75281: Calling all_plugins_play to load vars for managed_node1 30529 1726882709.75284: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882709.75287: Calling groups_plugins_play to load vars for managed_node1 30529 1726882709.76248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882709.77078: done with get_vars() 30529 1726882709.77095: variable 'ansible_search_path' from source: unknown 30529 1726882709.77096: variable 'ansible_search_path' from source: unknown 30529 1726882709.77141: variable 'ansible_search_path' from source: unknown 30529 1726882709.77143: variable 'ansible_search_path' from source: unknown 30529 1726882709.77169: we have included files to process 30529 1726882709.77170: generating all_blocks data 30529 1726882709.77172: done generating all_blocks data 30529 1726882709.77176: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882709.77177: loading included file: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882709.77179: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30529 1726882709.77363: done processing included file 30529 1726882709.77365: iterating over new_blocks loaded from include file 30529 1726882709.77366: in VariableManager get_vars() 30529 1726882709.77382: done with get_vars() 30529 1726882709.77384: filtering new block on tags 30529 1726882709.77412: done filtering new block on tags 30529 1726882709.77414: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node1 => (item=tasks/cleanup_profile+device.yml) 30529 1726882709.77418: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30529 1726882709.77419: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30529 1726882709.77422: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30529 1726882709.77762: done processing included file 30529 1726882709.77764: iterating over new_blocks loaded from include file 30529 1726882709.77765: in VariableManager get_vars() 30529 1726882709.77780: done with get_vars() 30529 1726882709.77782: filtering new block on tags 30529 1726882709.77816: done filtering new block on tags 30529 1726882709.77818: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 => (item=tasks/check_network_dns.yml) 
30529 1726882709.77821: extending task lists for all hosts with included blocks 30529 1726882709.78961: done extending task lists 30529 1726882709.78963: done processing included files 30529 1726882709.78963: results queue empty 30529 1726882709.78964: checking for any_errors_fatal 30529 1726882709.78965: done checking for any_errors_fatal 30529 1726882709.78966: checking for max_fail_percentage 30529 1726882709.78967: done checking for max_fail_percentage 30529 1726882709.78967: checking to see if all hosts have failed and the running result is not ok 30529 1726882709.78968: done checking to see if all hosts have failed 30529 1726882709.78968: getting the remaining hosts for this loop 30529 1726882709.78969: done getting the remaining hosts for this loop 30529 1726882709.78971: getting the next task for host managed_node1 30529 1726882709.78973: done getting next task for host managed_node1 30529 1726882709.78975: ^ task is: TASK: Cleanup profile and device 30529 1726882709.78976: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30529 1726882709.78978: getting variables
30529 1726882709.78979: in VariableManager get_vars()
30529 1726882709.78986: Calling all_inventory to load vars for managed_node1
30529 1726882709.78991: Calling groups_inventory to load vars for managed_node1
30529 1726882709.78995: Calling all_plugins_inventory to load vars for managed_node1
30529 1726882709.78999: Calling all_plugins_play to load vars for managed_node1
30529 1726882709.79000: Calling groups_plugins_inventory to load vars for managed_node1
30529 1726882709.79002: Calling groups_plugins_play to load vars for managed_node1
30529 1726882709.79608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30529 1726882709.80627: done with get_vars()
30529 1726882709.80647: done getting variables
30529 1726882709.80684: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Cleanup profile and device] **********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3
Friday 20 September 2024  21:38:29 -0400 (0:00:00.071)       0:02:03.833 ******
30529 1726882709.80715: entering _queue_task() for managed_node1/shell
30529 1726882709.81028: worker is 1 (out of 1 available)
30529 1726882709.81040: exiting _queue_task() for managed_node1/shell
30529 1726882709.81053: done queuing things up, now waiting for results queue to drain
30529 1726882709.81054: waiting for pending results...
30529 1726882709.81514: running TaskExecutor() for managed_node1/TASK: Cleanup profile and device 30529 1726882709.81519: in run() - task 12673a56-9f93-b0f1-edc0-00000000299e 30529 1726882709.81522: variable 'ansible_search_path' from source: unknown 30529 1726882709.81524: variable 'ansible_search_path' from source: unknown 30529 1726882709.81526: calling self._execute() 30529 1726882709.81611: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.81624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.81637: variable 'omit' from source: magic vars 30529 1726882709.82014: variable 'ansible_distribution_major_version' from source: facts 30529 1726882709.82033: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882709.82049: variable 'omit' from source: magic vars 30529 1726882709.82155: variable 'omit' from source: magic vars 30529 1726882709.82243: variable 'interface' from source: play vars 30529 1726882709.82271: variable 'omit' from source: magic vars 30529 1726882709.82314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882709.82355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882709.82389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882709.82416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.82439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882709.82481: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882709.82591: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.82596: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.82615: Set connection var ansible_shell_executable to /bin/sh 30529 1726882709.82626: Set connection var ansible_pipelining to False 30529 1726882709.82634: Set connection var ansible_shell_type to sh 30529 1726882709.82647: Set connection var ansible_timeout to 10 30529 1726882709.82655: Set connection var ansible_connection to ssh 30529 1726882709.82664: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882709.82691: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.82705: variable 'ansible_connection' from source: unknown 30529 1726882709.82712: variable 'ansible_module_compression' from source: unknown 30529 1726882709.82719: variable 'ansible_shell_type' from source: unknown 30529 1726882709.82725: variable 'ansible_shell_executable' from source: unknown 30529 1726882709.82732: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882709.82808: variable 'ansible_pipelining' from source: unknown 30529 1726882709.82811: variable 'ansible_timeout' from source: unknown 30529 1726882709.82814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882709.82901: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882709.82925: variable 'omit' from source: magic vars 30529 1726882709.82935: starting attempt loop 30529 1726882709.82942: running the handler 30529 1726882709.82958: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882709.82985: _low_level_execute_command(): starting 30529 1726882709.83000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882709.83739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.83753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.83769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.83799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882709.83901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.83918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.84004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.85629: stdout chunk (state=3): >>>/root <<< 30529 1726882709.85734: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30529 1726882709.85758: stderr chunk (state=3): >>><<< 30529 1726882709.85761: stdout chunk (state=3): >>><<< 30529 1726882709.85781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.85796: _low_level_execute_command(): starting 30529 1726882709.85804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241 `" && echo ansible-tmp-1726882709.8578038-35898-109183001722241="` echo /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241 `" ) && sleep 0' 30529 1726882709.86202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882709.86212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.86215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.86217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.86257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882709.86261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.86310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.88164: stdout chunk (state=3): >>>ansible-tmp-1726882709.8578038-35898-109183001722241=/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241 <<< 30529 1726882709.88274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.88306: stderr chunk (state=3): >>><<< 30529 1726882709.88309: stdout chunk (state=3): >>><<< 30529 1726882709.88324: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882709.8578038-35898-109183001722241=/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.88352: variable 'ansible_module_compression' from source: unknown 30529 1726882709.88392: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882709.88428: variable 'ansible_facts' from source: unknown 30529 1726882709.88480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py 30529 1726882709.88581: Sending initial data 30529 1726882709.88584: Sent initial data (156 bytes) 30529 1726882709.89048: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.89052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882709.89054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.89059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.89061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.89141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.89179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.90689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882709.90700: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882709.90729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882709.90768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpemoh2lwr /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py <<< 30529 1726882709.90776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py" <<< 30529 1726882709.90809: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpemoh2lwr" to remote "/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py" <<< 30529 1726882709.90817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py" <<< 30529 1726882709.91327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.91363: stderr chunk (state=3): >>><<< 30529 1726882709.91366: stdout chunk (state=3): >>><<< 30529 1726882709.91410: done transferring module to remote 30529 1726882709.91420: _low_level_execute_command(): starting 30529 1726882709.91424: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/ /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py && sleep 0' 30529 1726882709.92009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882709.92028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.92097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882709.93833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882709.93845: stdout chunk (state=3): >>><<< 30529 1726882709.93858: stderr chunk (state=3): >>><<< 30529 1726882709.93879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882709.93892: _low_level_execute_command(): starting 30529 1726882709.93906: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/AnsiballZ_command.py && sleep 0' 30529 1726882709.94504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30529 1726882709.94521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882709.94543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882709.94564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882709.94608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882709.94682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 
1726882709.94715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882709.94782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.12959: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:38:30.097411", "end": "2024-09-20 21:38:30.128506", "delta": "0:00:00.031095", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882710.14459: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882710.14485: stderr chunk (state=3): >>><<< 30529 1726882710.14488: stdout chunk (state=3): >>><<< 30529 1726882710.14508: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:38:30.097411", "end": "2024-09-20 21:38:30.128506", "delta": "0:00:00.031095", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 30529 1726882710.14541: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882710.14550: _low_level_execute_command(): starting 30529 1726882710.14555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882709.8578038-35898-109183001722241/ > /dev/null 2>&1 && sleep 0' 30529 1726882710.15010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.15013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found <<< 30529 1726882710.15016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.15018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.15020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.15022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.15073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.15081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.15084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.15119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.16927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.16951: stderr chunk (state=3): >>><<< 30529 1726882710.16954: stdout chunk (state=3): >>><<< 30529 1726882710.16966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.16972: handler run complete 30529 1726882710.16988: Evaluated conditional (False): False 30529 1726882710.17002: attempt loop complete, returning result 30529 1726882710.17005: _execute() done 30529 1726882710.17007: dumping result to json 30529 1726882710.17013: done dumping result, returning 30529 1726882710.17020: done running TaskExecutor() for managed_node1/TASK: Cleanup profile and device [12673a56-9f93-b0f1-edc0-00000000299e] 30529 1726882710.17024: sending task result for task 12673a56-9f93-b0f1-edc0-00000000299e 30529 1726882710.17120: done sending task result for task 12673a56-9f93-b0f1-edc0-00000000299e 30529 1726882710.17123: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.031095", "end": "2024-09-20 21:38:30.128506", "rc": 1, "start": "2024-09-20 21:38:30.097411" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30529 1726882710.17183: no more pending results, returning what we have 30529 1726882710.17189: results queue empty 30529 1726882710.17190: checking for any_errors_fatal 30529 1726882710.17191: done checking for any_errors_fatal 30529 1726882710.17192: checking for max_fail_percentage 30529 1726882710.17196: done checking for max_fail_percentage 30529 1726882710.17196: checking to see if all hosts have failed and the running result is not ok 30529 1726882710.17197: done checking to see if all hosts have failed 30529 1726882710.17198: getting the remaining hosts for this loop 30529 1726882710.17200: done getting the remaining hosts for this loop 30529 1726882710.17204: getting the next task for host managed_node1 30529 1726882710.17214: done getting next task for host managed_node1 30529 1726882710.17218: ^ task is: TASK: Check routes and DNS 30529 1726882710.17221: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882710.17225: getting variables 30529 1726882710.17227: in VariableManager get_vars() 30529 1726882710.17275: Calling all_inventory to load vars for managed_node1 30529 1726882710.17277: Calling groups_inventory to load vars for managed_node1 30529 1726882710.17281: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882710.17292: Calling all_plugins_play to load vars for managed_node1 30529 1726882710.17297: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882710.17300: Calling groups_plugins_play to load vars for managed_node1 30529 1726882710.18222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882710.19075: done with get_vars() 30529 1726882710.19091: done getting variables 30529 1726882710.19134: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:38:30 -0400 (0:00:00.384) 0:02:04.217 ****** 30529 1726882710.19157: entering _queue_task() for managed_node1/shell 30529 1726882710.19382: worker is 1 (out of 1 available) 30529 1726882710.19399: exiting _queue_task() for managed_node1/shell 30529 1726882710.19412: done queuing things up, now waiting for results queue to drain 30529 1726882710.19415: waiting for pending results... 
30529 1726882710.19591: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 30529 1726882710.19678: in run() - task 12673a56-9f93-b0f1-edc0-0000000029a2 30529 1726882710.19692: variable 'ansible_search_path' from source: unknown 30529 1726882710.19698: variable 'ansible_search_path' from source: unknown 30529 1726882710.19723: calling self._execute() 30529 1726882710.19804: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.19808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.19816: variable 'omit' from source: magic vars 30529 1726882710.20094: variable 'ansible_distribution_major_version' from source: facts 30529 1726882710.20103: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882710.20109: variable 'omit' from source: magic vars 30529 1726882710.20143: variable 'omit' from source: magic vars 30529 1726882710.20165: variable 'omit' from source: magic vars 30529 1726882710.20198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882710.20226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882710.20245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882710.20257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882710.20267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882710.20294: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882710.20299: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.20301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.20368: 
Set connection var ansible_shell_executable to /bin/sh 30529 1726882710.20372: Set connection var ansible_pipelining to False 30529 1726882710.20374: Set connection var ansible_shell_type to sh 30529 1726882710.20382: Set connection var ansible_timeout to 10 30529 1726882710.20384: Set connection var ansible_connection to ssh 30529 1726882710.20391: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882710.20408: variable 'ansible_shell_executable' from source: unknown 30529 1726882710.20413: variable 'ansible_connection' from source: unknown 30529 1726882710.20416: variable 'ansible_module_compression' from source: unknown 30529 1726882710.20419: variable 'ansible_shell_type' from source: unknown 30529 1726882710.20421: variable 'ansible_shell_executable' from source: unknown 30529 1726882710.20423: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.20425: variable 'ansible_pipelining' from source: unknown 30529 1726882710.20427: variable 'ansible_timeout' from source: unknown 30529 1726882710.20435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.20528: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882710.20538: variable 'omit' from source: magic vars 30529 1726882710.20541: starting attempt loop 30529 1726882710.20544: running the handler 30529 1726882710.20554: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882710.20570: 
_low_level_execute_command(): starting 30529 1726882710.20577: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882710.21088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.21095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882710.21099: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882710.21102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.21150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.21154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.21160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.21202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.22821: stdout chunk (state=3): >>>/root <<< 30529 1726882710.22917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.22943: stderr chunk (state=3): >>><<< 30529 1726882710.22949: stdout chunk (state=3): >>><<< 30529 1726882710.22965: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.22977: _low_level_execute_command(): starting 30529 1726882710.22982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517 `" && echo ansible-tmp-1726882710.2296457-35916-114280755033517="` echo /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517 `" ) && sleep 0' 30529 1726882710.23395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.23406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.23409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.23411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.23455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.23458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.23509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.25368: stdout chunk (state=3): >>>ansible-tmp-1726882710.2296457-35916-114280755033517=/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517 <<< 30529 1726882710.25476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.25506: stderr chunk (state=3): >>><<< 30529 1726882710.25510: stdout chunk (state=3): >>><<< 30529 1726882710.25527: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882710.2296457-35916-114280755033517=/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.25554: variable 'ansible_module_compression' from source: unknown 30529 1726882710.25599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882710.25631: variable 'ansible_facts' from source: unknown 30529 1726882710.25685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py 30529 1726882710.25787: Sending initial data 30529 1726882710.25796: Sent initial data (156 bytes) 30529 1726882710.26240: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.26243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882710.26246: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882710.26248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.26250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.26299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.26310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.26349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.27876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30529 1726882710.27880: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882710.27915: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882710.27956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpijg9erm4 /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py <<< 30529 1726882710.27959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py" <<< 30529 1726882710.28001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmpijg9erm4" to remote "/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py" <<< 30529 1726882710.28513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.28554: stderr chunk (state=3): >>><<< 30529 1726882710.28557: stdout chunk (state=3): >>><<< 30529 1726882710.28578: done transferring module to remote 30529 1726882710.28588: _low_level_execute_command(): starting 30529 1726882710.28596: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/ /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py && sleep 0' 30529 1726882710.29023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.29026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 
1726882710.29032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.29034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882710.29036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.29083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.29091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.29127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.30844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.30867: stderr chunk (state=3): >>><<< 30529 1726882710.30870: stdout chunk (state=3): >>><<< 30529 1726882710.30884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.30888: _low_level_execute_command(): starting 30529 1726882710.30895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/AnsiballZ_command.py && sleep 0' 30529 1726882710.31318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.31321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882710.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.31325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.31327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 30529 1726882710.31329: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.31373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.31376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.31427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.47243: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2476sec preferred_lft 2476sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:30.462435", "end": "2024-09-20 21:38:30.471259", "delta": "0:00:00.008824", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882710.48679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 30529 1726882710.48710: stderr chunk (state=3): >>><<< 30529 1726882710.48713: stdout chunk (state=3): >>><<< 30529 1726882710.48730: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2476sec preferred_lft 2476sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF 
/etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:30.462435", "end": "2024-09-20 21:38:30.471259", "delta": "0:00:00.008824", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
30529 1726882710.48768: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882710.48775: _low_level_execute_command(): starting 30529 1726882710.48780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882710.2296457-35916-114280755033517/ > /dev/null 2>&1 && sleep 0' 30529 1726882710.49233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.49237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.49239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 30529 1726882710.49241: stderr chunk 
(state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.49243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.49297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.49305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.49307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.49344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.51118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.51143: stderr chunk (state=3): >>><<< 30529 1726882710.51146: stdout chunk (state=3): >>><<< 30529 1726882710.51158: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.51165: handler run complete 30529 1726882710.51182: Evaluated conditional (False): False 30529 1726882710.51195: attempt loop complete, returning result 30529 1726882710.51198: _execute() done 30529 1726882710.51200: dumping result to json 30529 1726882710.51202: done dumping result, returning 30529 1726882710.51210: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [12673a56-9f93-b0f1-edc0-0000000029a2] 30529 1726882710.51214: sending task result for task 12673a56-9f93-b0f1-edc0-0000000029a2 30529 1726882710.51322: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000029a2 30529 1726882710.51325: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008824", "end": "2024-09-20 21:38:30.471259", "rc": 0, "start": "2024-09-20 21:38:30.462435" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2476sec preferred_lft 2476sec inet6 fe80::1030:bff:fea1:4223/64 scope 
link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 30529 1726882710.51405: no more pending results, returning what we have 30529 1726882710.51409: results queue empty 30529 1726882710.51410: checking for any_errors_fatal 30529 1726882710.51420: done checking for any_errors_fatal 30529 1726882710.51421: checking for max_fail_percentage 30529 1726882710.51423: done checking for max_fail_percentage 30529 1726882710.51424: checking to see if all hosts have failed and the running result is not ok 30529 1726882710.51425: done checking to see if all hosts have failed 30529 1726882710.51425: getting the remaining hosts for this loop 30529 1726882710.51427: done getting the remaining hosts for this loop 30529 1726882710.51431: getting the next task for host managed_node1 30529 1726882710.51438: done getting next task for host managed_node1 30529 1726882710.51440: ^ task is: TASK: Verify DNS and network connectivity 30529 1726882710.51445: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882710.51453: getting variables 30529 1726882710.51454: in VariableManager get_vars() 30529 1726882710.51502: Calling all_inventory to load vars for managed_node1 30529 1726882710.51505: Calling groups_inventory to load vars for managed_node1 30529 1726882710.51508: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882710.51519: Calling all_plugins_play to load vars for managed_node1 30529 1726882710.51522: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882710.51524: Calling groups_plugins_play to load vars for managed_node1 30529 1726882710.52349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882710.53327: done with get_vars() 30529 1726882710.53344: done getting variables 30529 1726882710.53386: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:38:30 -0400 (0:00:00.342) 0:02:04.560 ****** 30529 1726882710.53412: entering _queue_task() for managed_node1/shell 30529 1726882710.53646: worker is 1 (out of 1 available) 30529 1726882710.53659: exiting _queue_task() for managed_node1/shell 30529 1726882710.53674: done queuing things up, now waiting for results queue to drain 30529 1726882710.53675: waiting for pending results... 
30529 1726882710.53850: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 30529 1726882710.53930: in run() - task 12673a56-9f93-b0f1-edc0-0000000029a3 30529 1726882710.53942: variable 'ansible_search_path' from source: unknown 30529 1726882710.53946: variable 'ansible_search_path' from source: unknown 30529 1726882710.53971: calling self._execute() 30529 1726882710.54056: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.54060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.54069: variable 'omit' from source: magic vars 30529 1726882710.54354: variable 'ansible_distribution_major_version' from source: facts 30529 1726882710.54365: Evaluated conditional (ansible_distribution_major_version != '6'): True 30529 1726882710.54464: variable 'ansible_facts' from source: unknown 30529 1726882710.54939: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 30529 1726882710.54944: variable 'omit' from source: magic vars 30529 1726882710.54977: variable 'omit' from source: magic vars 30529 1726882710.55005: variable 'omit' from source: magic vars 30529 1726882710.55036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30529 1726882710.55063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30529 1726882710.55079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30529 1726882710.55095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882710.55108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30529 1726882710.55131: variable 'inventory_hostname' from source: host vars for 'managed_node1' 30529 1726882710.55134: variable 
'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.55136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.55212: Set connection var ansible_shell_executable to /bin/sh 30529 1726882710.55217: Set connection var ansible_pipelining to False 30529 1726882710.55221: Set connection var ansible_shell_type to sh 30529 1726882710.55230: Set connection var ansible_timeout to 10 30529 1726882710.55233: Set connection var ansible_connection to ssh 30529 1726882710.55237: Set connection var ansible_module_compression to ZIP_DEFLATED 30529 1726882710.55253: variable 'ansible_shell_executable' from source: unknown 30529 1726882710.55256: variable 'ansible_connection' from source: unknown 30529 1726882710.55259: variable 'ansible_module_compression' from source: unknown 30529 1726882710.55261: variable 'ansible_shell_type' from source: unknown 30529 1726882710.55263: variable 'ansible_shell_executable' from source: unknown 30529 1726882710.55265: variable 'ansible_host' from source: host vars for 'managed_node1' 30529 1726882710.55270: variable 'ansible_pipelining' from source: unknown 30529 1726882710.55272: variable 'ansible_timeout' from source: unknown 30529 1726882710.55276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 30529 1726882710.55375: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882710.55384: variable 'omit' from source: magic vars 30529 1726882710.55389: starting attempt loop 30529 1726882710.55396: running the handler 30529 1726882710.55406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 30529 1726882710.55423: _low_level_execute_command(): starting 30529 1726882710.55431: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30529 1726882710.55931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.55935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.55938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.55940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.55985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.56008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.56046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.57638: stdout chunk (state=3): >>>/root <<< 30529 1726882710.57732: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30529 1726882710.57756: stderr chunk (state=3): >>><<< 30529 1726882710.57759: stdout chunk (state=3): >>><<< 30529 1726882710.57779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.57789: _low_level_execute_command(): starting 30529 1726882710.57800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267 `" && echo ansible-tmp-1726882710.5777664-35924-120395520168267="` echo /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267 `" ) && sleep 0' 30529 1726882710.58215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30529 1726882710.58226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.58230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.58232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.58272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.58275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.58323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.60181: stdout chunk (state=3): >>>ansible-tmp-1726882710.5777664-35924-120395520168267=/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267 <<< 30529 1726882710.60292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.60317: stderr chunk (state=3): >>><<< 30529 1726882710.60321: stdout chunk (state=3): >>><<< 30529 1726882710.60333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882710.5777664-35924-120395520168267=/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.60358: variable 'ansible_module_compression' from source: unknown 30529 1726882710.60400: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30529ykg6b3r2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30529 1726882710.60429: variable 'ansible_facts' from source: unknown 30529 1726882710.60481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py 30529 1726882710.60574: Sending initial data 30529 1726882710.60577: Sent initial data (156 bytes) 30529 1726882710.61004: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.61007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882710.61009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30529 1726882710.61012: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882710.61013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.61060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.61067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.61106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.62607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30529 1726882710.62611: stderr chunk (state=3): 
>>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30529 1726882710.62646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30529 1726882710.62687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp4zhk9agd /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py <<< 30529 1726882710.62692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py" <<< 30529 1726882710.62726: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30529ykg6b3r2/tmp4zhk9agd" to remote "/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py" <<< 30529 1726882710.63241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.63282: stderr chunk (state=3): >>><<< 30529 1726882710.63286: stdout chunk (state=3): >>><<< 30529 1726882710.63317: done transferring module to remote 30529 1726882710.63326: _low_level_execute_command(): starting 30529 1726882710.63330: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/ /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py && sleep 0' 30529 1726882710.63769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.63773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.63779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 30529 1726882710.63781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30529 1726882710.63783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.63830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.63837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882710.63839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.63880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882710.65570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882710.65600: stderr chunk (state=3): >>><<< 30529 1726882710.65603: stdout chunk (state=3): >>><<< 30529 1726882710.65618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882710.65621: _low_level_execute_command(): starting 30529 1726882710.65626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/AnsiballZ_command.py && sleep 0' 30529 1726882710.66056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882710.66059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.66062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882710.66064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 
30529 1726882710.66066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882710.66103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882710.66116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882710.66170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882711.01879: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 
305 0 0 4132 0 --:--:-- --:--:-- --:--:-- 4178\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2610 0 --:--:-- --:--:-- --:--:-- 2621", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:30.809668", "end": "2024-09-20 21:38:31.017498", "delta": "0:00:00.207830", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30529 1726882711.03453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 30529 1726882711.03480: stderr chunk (state=3): >>><<< 30529 1726882711.03483: stdout chunk (state=3): >>><<< 30529 1726882711.03511: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4132 0 --:--:-- --:--:-- --:--:-- 4178\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2610 0 --:--:-- --:--:-- --:--:-- 2621", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:30.809668", "end": "2024-09-20 21:38:31.017498", "delta": "0:00:00.207830", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 30529 1726882711.03547: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30529 1726882711.03553: _low_level_execute_command(): starting 30529 1726882711.03558: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882710.5777664-35924-120395520168267/ > /dev/null 2>&1 && sleep 0' 30529 1726882711.04011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30529 1726882711.04015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 30529 1726882711.04018: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30529 1726882711.04020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30529 1726882711.04072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 30529 1726882711.04075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30529 1726882711.04081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30529 1726882711.04124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30529 1726882711.05944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30529 1726882711.05972: stderr chunk (state=3): >>><<< 30529 1726882711.05976: stdout chunk (state=3): >>><<< 30529 1726882711.05992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30529 1726882711.05997: handler run complete 30529 1726882711.06014: Evaluated conditional (False): False 30529 1726882711.06023: attempt loop complete, returning result 30529 1726882711.06025: _execute() done 30529 1726882711.06028: dumping result to json 30529 1726882711.06033: done dumping result, returning 30529 1726882711.06040: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [12673a56-9f93-b0f1-edc0-0000000029a3] 30529 1726882711.06044: sending task result for task 12673a56-9f93-b0f1-edc0-0000000029a3 30529 1726882711.06148: done sending task result for task 12673a56-9f93-b0f1-edc0-0000000029a3 30529 1726882711.06151: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.207830", "end": "2024-09-20 21:38:31.017498", "rc": 0, "start": "2024-09-20 21:38:30.809668" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 4132 0 --:--:-- --:--:-- --:--:-- 4178 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2610 0 --:--:-- --:--:-- --:--:-- 2621 30529 1726882711.06228: no more pending results, returning what we have 30529 1726882711.06232: results queue empty 30529 1726882711.06233: 
checking for any_errors_fatal 30529 1726882711.06243: done checking for any_errors_fatal 30529 1726882711.06243: checking for max_fail_percentage 30529 1726882711.06245: done checking for max_fail_percentage 30529 1726882711.06246: checking to see if all hosts have failed and the running result is not ok 30529 1726882711.06247: done checking to see if all hosts have failed 30529 1726882711.06247: getting the remaining hosts for this loop 30529 1726882711.06249: done getting the remaining hosts for this loop 30529 1726882711.06257: getting the next task for host managed_node1 30529 1726882711.06267: done getting next task for host managed_node1 30529 1726882711.06270: ^ task is: TASK: meta (flush_handlers) 30529 1726882711.06271: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882711.06276: getting variables 30529 1726882711.06278: in VariableManager get_vars() 30529 1726882711.06322: Calling all_inventory to load vars for managed_node1 30529 1726882711.06325: Calling groups_inventory to load vars for managed_node1 30529 1726882711.06328: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882711.06339: Calling all_plugins_play to load vars for managed_node1 30529 1726882711.06342: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882711.06344: Calling groups_plugins_play to load vars for managed_node1 30529 1726882711.07184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882711.08051: done with get_vars() 30529 1726882711.08067: done getting variables 30529 1726882711.08120: in VariableManager get_vars() 30529 1726882711.08130: Calling all_inventory to load vars for managed_node1 30529 1726882711.08132: Calling groups_inventory to load vars for managed_node1 30529 1726882711.08133: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882711.08136: Calling all_plugins_play to load vars for managed_node1 30529 1726882711.08138: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882711.08139: Calling groups_plugins_play to load vars for managed_node1 30529 1726882711.08866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882711.09714: done with get_vars() 30529 1726882711.09732: done queuing things up, now waiting for results queue to drain 30529 1726882711.09733: results queue empty 30529 1726882711.09734: checking for any_errors_fatal 30529 1726882711.09736: done checking for any_errors_fatal 30529 1726882711.09736: checking for max_fail_percentage 30529 1726882711.09737: done checking for max_fail_percentage 30529 1726882711.09738: checking to see if all hosts have failed and the running result is not 
ok 30529 1726882711.09738: done checking to see if all hosts have failed 30529 1726882711.09739: getting the remaining hosts for this loop 30529 1726882711.09740: done getting the remaining hosts for this loop 30529 1726882711.09742: getting the next task for host managed_node1 30529 1726882711.09745: done getting next task for host managed_node1 30529 1726882711.09746: ^ task is: TASK: meta (flush_handlers) 30529 1726882711.09747: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30529 1726882711.09749: getting variables 30529 1726882711.09749: in VariableManager get_vars() 30529 1726882711.09756: Calling all_inventory to load vars for managed_node1 30529 1726882711.09757: Calling groups_inventory to load vars for managed_node1 30529 1726882711.09759: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882711.09763: Calling all_plugins_play to load vars for managed_node1 30529 1726882711.09764: Calling groups_plugins_inventory to load vars for managed_node1 30529 1726882711.09766: Calling groups_plugins_play to load vars for managed_node1 30529 1726882711.10396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882711.11217: done with get_vars() 30529 1726882711.11231: done getting variables 30529 1726882711.11264: in VariableManager get_vars() 30529 1726882711.11272: Calling all_inventory to load vars for managed_node1 30529 1726882711.11273: Calling groups_inventory to load vars for managed_node1 30529 1726882711.11275: Calling all_plugins_inventory to load vars for managed_node1 30529 1726882711.11278: Calling all_plugins_play to load vars for managed_node1 30529 1726882711.11279: Calling groups_plugins_inventory to load vars for 
managed_node1 30529 1726882711.11280: Calling groups_plugins_play to load vars for managed_node1 30529 1726882711.11958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30529 1726882711.12801: done with get_vars() 30529 1726882711.12819: done queuing things up, now waiting for results queue to drain 30529 1726882711.12820: results queue empty 30529 1726882711.12821: checking for any_errors_fatal 30529 1726882711.12822: done checking for any_errors_fatal 30529 1726882711.12822: checking for max_fail_percentage 30529 1726882711.12823: done checking for max_fail_percentage 30529 1726882711.12823: checking to see if all hosts have failed and the running result is not ok 30529 1726882711.12824: done checking to see if all hosts have failed 30529 1726882711.12824: getting the remaining hosts for this loop 30529 1726882711.12825: done getting the remaining hosts for this loop 30529 1726882711.12826: getting the next task for host managed_node1 30529 1726882711.12828: done getting next task for host managed_node1 30529 1726882711.12829: ^ task is: None 30529 1726882711.12830: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30529 1726882711.12831: done queuing things up, now waiting for results queue to drain 30529 1726882711.12831: results queue empty 30529 1726882711.12832: checking for any_errors_fatal 30529 1726882711.12832: done checking for any_errors_fatal 30529 1726882711.12832: checking for max_fail_percentage 30529 1726882711.12833: done checking for max_fail_percentage 30529 1726882711.12834: checking to see if all hosts have failed and the running result is not ok 30529 1726882711.12834: done checking to see if all hosts have failed 30529 1726882711.12835: getting the next task for host managed_node1 30529 1726882711.12837: done getting next task for host managed_node1 30529 1726882711.12837: ^ task is: None 30529 1726882711.12838: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node1 : ok=333 changed=10 unreachable=0 failed=0 skipped=313 rescued=0 ignored=9 Friday 20 September 2024 21:38:31 -0400 (0:00:00.595) 0:02:05.155 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 1.95s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.89s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.84s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.83s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.82s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.81s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.81s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.81s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.79s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.75s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.75s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.72s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.72s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.71s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.70s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.43s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6 fedora.linux_system_roles.network : Check which packages are installed --- 1.16s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.99s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Check which packages are installed --- 0.95s 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 30529 1726882711.13044: RUNNING CLEANUP
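For readability, the script that the "Verify DNS and network connectivity" task ran (shown JSON-escaped in the `_raw_params`/`cmd` fields of the task result above) can be re-extracted as a standalone shell script. This is a minimal sketch: the `check_hosts` wrapper function is an editorial addition for reuse and testing, while the loop body reproduces the task's script; note that `curl` without `-f` still exits 0 on an HTTP error status, so this check only verifies that a TLS connection and response were obtained, and its progress meter is what appears in the task's STDERR.

```shell
#!/usr/bin/env bash
# Re-extraction of the script embedded in the task result above.
# check_hosts is an editorial wrapper; the loop body matches _raw_params.
set -euo pipefail

check_hosts() {
  echo CHECK DNS AND CONNECTIVITY
  for host in "$@"; do
    # DNS check: resolve the name through NSS (getent covers /etc/hosts
    # and the configured resolver, matching the IPv6 answers in STDOUT).
    if ! getent hosts "$host"; then
      echo FAILED to lookup host "$host"
      return 1
    fi
    # Connectivity check: fetch over HTTPS and discard the body.
    if ! curl -o /dev/null https://"$host"; then
      echo FAILED to contact host "$host"
      return 1
    fi
  done
}
```

Invoking `check_hosts mirrors.fedoraproject.org mirrors.centos.org` reproduces the task's behavior: it prints the header, one `getent` line per resolved address, and exits non-zero on the first lookup or connection failure (which `set -euo pipefail` plus the task's `rc` handling then surfaces as a task failure).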